{{Short description|Techniques and methods in signal processing}}
{{See also|Time–frequency representation}}
In [[signal processing]], '''time–frequency analysis''' comprises those techniques that study a signal in both the time and frequency domains ''simultaneously,'' using various [[time–frequency representation]]s. Rather than viewing a one-dimensional signal (a real- or complex-valued function whose ___domain is the real line) together with its transform (another function on the real line, obtained from the original by some transform), time–frequency analysis studies a two-dimensional signal – a function whose ___domain is the two-dimensional real plane, obtained from the signal via a time–frequency transform.<ref>L. Cohen, ''Time–Frequency Analysis'', Prentice-Hall, New York, 1995. {{isbn|978-0135945322}}</ref><ref>E. Sejdić, I. Djurović, J. Jiang, "Time–frequency feature representation using energy concentration: An overview of recent advances", ''Digital Signal Processing'', vol. 19, no. 1, pp. 153–183, January 2009.</ref>
 
The mathematical motivation for this study is that functions and their transform representations are tightly connected, and they can be understood better by studying them jointly, as a two-dimensional object, rather than separately. A simple example is that the 4-fold periodicity of the [[Fourier transform]] – and the fact that the two-fold Fourier transform reverses direction – can be interpreted by considering the Fourier transform as a 90° rotation in the associated time–frequency plane: 4 such rotations yield the identity, and 2 such rotations simply reverse direction ([[reflection through the origin]]).
The practical motivation for time–frequency analysis is that classical [[Fourier analysis]] assumes that signals are infinite in time or periodic, while many signals in practice are of short duration, and change substantially over their duration. For example, traditional musical instruments do not produce infinite duration sinusoids, but instead begin with an attack, then gradually decay. This is poorly represented by traditional methods, which motivates time–frequency analysis.
 
One of the most basic forms of time–frequency analysis is the [[short-time Fourier transform]] (STFT), but more sophisticated techniques have been developed, notably [[wavelet]]s and [[least-squares spectral analysis]] methods for unevenly spaced data.
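
As a minimal illustrative sketch of the STFT idea (using NumPy and SciPy; the sampling rate, test chirp, and window length below are arbitrary choices, not part of any definition above), the dominant frequency of each windowed segment tracks the chirp's instantaneous frequency, which a single global Fourier transform cannot reveal:

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import stft

fs = 1000                                   # sampling rate in Hz (arbitrary choice)
t = np.arange(0, 2.0, 1 / fs)
# Linear chirp: instantaneous frequency rises from 50 Hz to 250 Hz over 2 s.
x = np.cos(2 * np.pi * (50 * t + 50 * t ** 2))

# STFT: slide a window along the signal and Fourier-transform each windowed segment.
f, seg_t, Zxx = stft(x, fs=fs, nperseg=256)

# The dominant frequency of each segment follows the rising instantaneous frequency.
peak_freq = f[np.argmax(np.abs(Zxx), axis=0)]
print(np.round(peak_freq[::3]))             # roughly 50 Hz at the start, 250 Hz at the end
</syntaxhighlight>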
 
==Motivation==
 
[[File:X1(t).jpg|thumb]]
 
Once such a representation has been generated, other techniques in time–frequency analysis may then be applied to the signal in order to extract information from it, to separate the signal from noise or interfering signals, and so on.
#'''Lower computational complexity''', so that the time needed to represent and process a signal on a time–frequency plane allows real-time implementations.
 
Below is a brief comparison of some selected time–frequency distribution functions.<ref>{{Cite journal|last1=Shafi|first1=Imran|last2=Ahmad|first2=Jamil|last3=Shah|first3=Syed Ismail|last4=Kashif|first4=F. M.|date=2009-06-09|title=Techniques to Obtain Good Resolution and Concentrated Time-Frequency Distributions: A Review|journal=EURASIP Journal on Advances in Signal Processing|language=en|volume=2009|issue=1|pages=673539|doi=10.1155/2009/673539|bibcode=2009EJASP2009..109S |issn=1687-6180|doi-access=free|hdl=1721.1/50243|hdl-access=free}}</ref>
 
{| class="wikitable"
\cos(3 \pi t); & t > 20
\end{cases}</math>
[[File:X1-x2.jpg|thumb]]
 
 
But time–frequency analysis can.
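
This can be checked numerically. In the sketch below (an illustration only: the sampling rate, segment frequencies, and window length are arbitrary choices mimicking the piecewise signals above), two signals containing the same frequency components in a different temporal order have the same dominant peaks in their global Fourier magnitude spectra, while their spectrograms clearly differ:

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import stft

fs = 100
t = np.arange(0, 30, 1 / fs)

def piecewise(freqs_hz):
    """Three 10-second cosine segments at the given frequencies, concatenated in time."""
    parts = [np.cos(2 * np.pi * f0 * t[(t >= 10 * i) & (t < 10 * (i + 1))])
             for i, f0 in enumerate(freqs_hz)]
    return np.concatenate(parts)

x1 = piecewise([0.5, 1.5, 1.0])    # cos(pi t), cos(3 pi t), cos(2 pi t) ...
x2 = piecewise([1.0, 1.5, 0.5])    # ... the same components in a different order

def top_spectral_peaks(x, k=3):
    """Frequencies of the k largest bins of the global Fourier magnitude spectrum."""
    mag = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), 1 / fs)
    return np.sort(freqs[np.argsort(mag)[-k:]])

# The global spectra show the same three peaks (0.5, 1.0 and 1.5 Hz) for both signals...
print(np.round(top_spectral_peaks(x1), 2), np.round(top_spectral_peaks(x2), 2))

# ...but the spectrograms reveal the different time ordering of those components.
for sig in (x1, x2):
    f, seg_t, Z = stft(sig, fs=fs, nperseg=512)
    print(np.round(f[np.argmax(np.abs(Z), axis=0)], 2))
</syntaxhighlight>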
 
== Time–frequency analysis and random processes ==
For a random process x(t), the value at a given time cannot be specified explicitly; instead, x(t) is described by a probability distribution, and the quantities below are defined as ensemble averages (expectations).<ref>{{Cite book |last=Ding |first=Jian-Jiun |title=Time frequency analysis and wavelet transform class notes |publisher=Graduate Institute of Communication Engineering, National Taiwan University (NTU) |year=2022 |___location=Taipei, Taiwan}}</ref>
 
=== General random processes ===
 
* Auto-covariance function (ACF) <math>R_x(t,\tau)</math>
:<math>R_x(t,\tau) = E[x(t+\tau/2)x^*(t-\tau/2)]</math>
:Usually, we assume that <math>E[x(t)] = 0</math> for any ''t''. Writing the expectation in terms of the joint probability density <math>P(\xi_1,\xi_2)</math> of the two samples,
:<math>E[x(t+\tau/2)x^*(t-\tau/2)]=\iint x(t+\tau/2,\xi_1)x^*(t-\tau/2,\xi_2)P(\xi_1,\xi_2)\,d\xi_1\,d\xi_2</math>
:An alternative definition of the auto-covariance function is
:<math>\hat{R}_x(t,\tau)=E[x(t)x(t+\tau)]</math>
:(a numerical estimate of these quantities is sketched after this list).
* Power spectral density (PSD) <math>S_x(t,f)</math>
:<math>S_x(t,f) = \int_{-\infty}^{\infty} R_x(t,\tau)e^{-j2\pi f\tau}d\tau</math>
* Relation between the [[Wigner distribution function|WDF (Wigner Distribution Function)]] and the PSD
:<math>E[W_x(t,f)] = \int_{-\infty}^{\infty} E[x(t+\tau/2)x^*(t-\tau/2)]\cdot e^{-j2\pi f\tau}\cdot d\tau</math>
:::::<math>= \int_{-\infty}^{\infty} R_x(t,\tau)\cdot e^{-j2\pi f\tau}\cdot d\tau</math><math>= S_x(t,f)</math>
* Relation between the [[ambiguity function]] and the ACF
:<math>E[A_X(\eta,\tau)] = \int_{-\infty}^{\infty} E[x(t+\tau/2)x^*(t-\tau/2)]e^{-j2\pi t\eta}dt</math>
:::::<math>= \int_{-\infty}^{\infty} R_x(t,\tau)e^{-j2\pi t\eta}dt</math>
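
A numerical sketch of these ensemble averages follows (an illustration only: the amplitude-modulated low-pass noise is an arbitrary example process, the τ/2-centred ACF is replaced by the "alternative definition" above so that everything stays on the sample grid, and the expectations are approximated by averaging over simulated realizations):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
fs, T, n_real = 100, 4.0, 2000          # sampling rate, duration, number of realizations
t = np.arange(0, T, 1 / fs)
N = len(t)

# Example non-stationary process: low-pass noise with a time-varying amplitude a(t).
a = 1.0 + 0.8 * np.sin(2 * np.pi * 0.5 * t)
def realization():
    w = rng.standard_normal(N + 19)
    return a * np.convolve(w, np.ones(20) / 20, mode="valid")   # length N

X = np.array([realization() for _ in range(n_real)])            # shape (n_real, N)

# Ensemble estimate of the ACF at one time instant t0, using the "alternative
# definition" R_x(t, tau) = E[x(t) x(t + tau)] evaluated on the sample grid.
t0, max_lag = N // 2, 40
lags = np.arange(-max_lag, max_lag + 1)
R = np.array([np.mean(X[:, t0] * X[:, t0 + L]) for L in lags])

# Local PSD at t0: Fourier transform of the estimated ACF over the lag variable.
S = np.abs(np.fft.fftshift(np.fft.fft(R)))
print(np.round(R[max_lag - 2:max_lag + 3], 3))           # ACF near zero lag
print(np.round(S[len(S) // 2 - 2:len(S) // 2 + 3], 3))   # PSD near zero frequency

# Non-stationarity: the local power E[x(t)^2] follows the squared envelope a(t)^2.
print(np.round(X.var(axis=0)[[0, 50, 150]], 3))          # estimated local power
print(np.round(a[[0, 50, 150]] ** 2 / 20, 3))            # theoretical value (noise power = 1/20)
</syntaxhighlight>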
 
=== Stationary random processes ===
* [[Stationary process|Stationary random process]]: the statistical properties do not change with ''t''. Its auto-covariance function satisfies
:<math>R_x(t_1,\tau) = R_x(t_2,\tau) = R_x(\tau)</math> for any <math>t_1, t_2</math>. Therefore,
:<math>R_x(\tau) = E[x(\tau/2)x^*(-\tau/2)]=\iint x(\tau/2,\xi_1)x^*(-\tau/2,\xi_2)P(\xi_1,\xi_2)\,d\xi_1\,d\xi_2</math>
:and the PSD becomes
:<math>S_x(f) = \int_{-\infty}^{\infty} R_x(\tau)e^{-j2\pi f\tau}\,d\tau.</math>
:For white noise,
:<math>S_x(f) = \sigma,</math> where <math>\sigma</math> is some constant.[[File:Stationary random process's WDF and AF.jpg|right|frameless|440x440px]]
* When x(t) is stationary,
:<math>E[W_x(t,f)] = S_x(f)</math> (invariant with <math>t</math>), and
:<math>E[A_x(\eta,\tau)] = \int_{-\infty}^{\infty} R_x(\tau)\cdot e^{-j2\pi t\eta}\,dt = R_x(\tau)\int_{-\infty}^{\infty} e^{-j2\pi t\eta}\,dt= R_x(\tau)\delta(\eta)</math> (nonzero only when <math>\eta = 0</math>); a numerical sketch follows.
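
For a stationary process the ACF depends only on the lag, so under an additional ergodicity assumption (not discussed above) it can also be estimated by time-averaging a single long realization. A minimal sketch (the moving-average noise process and all parameter values are arbitrary illustrative choices):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)
fs, N = 100, 200_000
# Stationary process: white Gaussian noise passed through a fixed moving-average filter.
w = rng.standard_normal(N + 9)
x = np.convolve(w, np.ones(10) / 10, mode="valid")       # length N

# Time-averaged ACF estimate R_x(tau); valid for stationary, ergodic processes.
max_lag = 30
R = np.array([np.mean(x[:N - max_lag] * x[L:N - max_lag + L])
              for L in range(max_lag + 1)])

# PSD = Fourier transform of the ACF (symmetrically extended over negative lags).
S = np.fft.rfft(np.concatenate([R, R[-2:0:-1]])).real
print(np.round(R[:5], 4))      # triangular ACF of the moving-average filter: 0.10, 0.09, ...
print(np.round(S[:5], 3))      # low-pass shaped PSD of the stationary process
</syntaxhighlight>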
 
=== Additive white noise ===
* For additive white noise (AWN) <math>n(t)</math>,
:<math>E[W_n(t,f)] = \sigma</math>
:<math>E[A_n(\eta,\tau)] = \sigma\delta(\tau)\delta(\eta)</math>
 
* Filter Design for a signal in additive white noise
[[File:Filter design for white noise.jpg|left|thumb|440x440px]]
 
<math>E_x</math>: energy of the signal

<math>A</math>: area of the time–frequency distribution of the signal

The PSD of the white noise is <math>S_n(f) = \sigma</math>. If an ideal time–frequency filter keeps only the region of the plane occupied by the signal, the output SNR is approximately

<math>SNR \approx 10\log_{10}\frac{E_x}{\iint\limits_{(t,f)\in\text{signal part}} S_n(t,f)\,dt\,df}</math>

<math>SNR \approx 10\log_{10}\frac{E_x}{\sigma A}</math>
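
A rough numerical reading of this approximation (a sketch only: the test chirp, the noise level <math>\sigma</math>, and the assumed time–frequency area <math>A</math> are arbitrary choices):

<syntaxhighlight lang="python">
import numpy as np

fs = 1000
t = np.arange(0, 1.0, 1 / fs)
# Example signal: a linear chirp sweeping 100 Hz -> 300 Hz over 1 s.
x = np.cos(2 * np.pi * (100 * t + 100 * t ** 2))

E_x = np.sum(np.abs(x) ** 2) / fs          # signal energy, i.e. the integral of |x|^2 dt
sigma = 1e-3                               # assumed white-noise PSD level

# Area of the signal's support on the time-frequency plane: roughly
# duration (1 s) times instantaneous-frequency swing (200 Hz).
A = 1.0 * 200.0

snr_db = 10 * np.log10(E_x / (sigma * A))  # SNR after masking everything outside the signal region
print(round(snr_db, 1))                    # about 4 dB for these arbitrary numbers
</syntaxhighlight>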
 
=== Non-stationary random processes ===
* If <math>E[W_x(t,f)]</math> varies with <math>t</math> – equivalently, if <math>E[A_x(\eta,\tau)]</math> is nonzero for some <math>\eta \neq 0</math> – then <math>x(t)</math> is a non-stationary random process.
* If
*# <math>h(t) = x_1(t)+x_2(t)+x_3(t)+\cdots+x_k(t)</math>,
*# each <math>x_n(t)</math> has zero mean for all <math>t</math>, and
*# the <math>x_n(t)</math> are mutually independent for all <math>t</math> and <math>\tau</math>,
:then, for <math>m \neq n</math>,
::<math>E[x_m(t+\tau/2)x_n^*(t-\tau/2)] = E[x_m(t+\tau/2)]E[x_n^*(t-\tau/2)] = 0</math>
:and therefore
::<math>E[W_h(t,f)] = \sum_{n=1}^k E[W_{x_n}(t,f)]</math>
::<math>E[A_h(\eta,\tau)] = \sum_{n=1}^k E[A_{x_n}(\eta,\tau)]</math>
:(see the Monte Carlo sketch below).
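
A Monte Carlo sketch of this additivity is given below. It uses averaged spectrograms as a stand-in for the Wigner distribution (an assumption made here for simplicity – the expected cross terms also vanish for the spectrogram when the components are independent and zero-mean), and the two sinusoidal components with random amplitude and phase are arbitrary illustrative choices:

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import stft

rng = np.random.default_rng(2)
fs, N, n_real = 100, 1000, 500
t = np.arange(N) / fs

def component(f0):
    """Zero-mean random component: a sinusoid at f0 with random amplitude and phase."""
    phase = rng.uniform(0, 2 * np.pi, size=(n_real, 1))
    amp = rng.standard_normal((n_real, 1))
    return amp * np.cos(2 * np.pi * f0 * t + phase)

x1, x2 = component(5.0), component(15.0)   # two mutually independent processes
h = x1 + x2                                # their sum, realization by realization

def mean_spec(X):
    """Spectrogram averaged over realizations, approximating E[|STFT|^2]."""
    _, _, Z = stft(X, fs=fs, nperseg=128, axis=-1)
    return np.mean(np.abs(Z) ** 2, axis=0)

lhs = mean_spec(h)
rhs = mean_spec(x1) + mean_spec(x2)
# The cross terms vanish in expectation, so both sides agree up to Monte Carlo error.
print(np.max(np.abs(lhs - rhs)) / np.max(rhs))
</syntaxhighlight>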
 
=== Short-time Fourier transform ===
* Random process for [[Short-time Fourier transform|STFT (Short Time Fourier Transform)]]
For the STFT of a random process to be informative, <math>E[x(t)]\neq 0</math> should be satisfied. Otherwise, since
:<math>E[X(t,f)] = E\left[\int_{t-B}^{t+B} x(\tau)w(t-\tau)e^{-j2\pi f\tau}\,d\tau\right] =\int_{t-B}^{t+B} E[x(\tau)]\,w(t-\tau)e^{-j2\pi f\tau}\,d\tau,</math>
a zero-mean random process gives <math>E[X(t,f)] = 0</math> (see the Monte Carlo sketch at the end of this subsection).
 
* Decomposition by the AF and the FRFT: any non-stationary random process can be expressed as a sum of fractional Fourier transforms (or chirp multiplications) of stationary random processes.
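
A Monte Carlo check of the zero-mean case discussed above (a sketch only; the filtered-noise process, window length, and number of realizations are arbitrary choices): averaging the complex STFT over realizations drives it towards zero, while the averaged spectrogram does not vanish.

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import stft, lfilter

rng = np.random.default_rng(3)
fs, N, n_real = 100, 1000, 2000
# Zero-mean random process: white noise passed through a short moving-average filter.
w = rng.standard_normal((n_real, N + 9))
x = lfilter(np.ones(10) / 10, [1.0], w, axis=1)[:, 9:]   # shape (n_real, N)

_, _, Z = stft(x, fs=fs, nperseg=128, axis=-1)           # STFT of every realization

mean_stft = np.mean(Z, axis=0)                # approximates E[X(t, f)]
mean_spec = np.mean(np.abs(Z) ** 2, axis=0)   # approximates E[|X(t, f)|^2]
print(np.abs(mean_stft).max())                # close to 0: the averaged STFT carries no information
print(mean_spec.max())                        # clearly nonzero
</syntaxhighlight>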
 
==Applications==
 
The following applications need not only the time–frequency distribution functions but also some operations on the signal. The [[linear canonical transform]] (LCT) is particularly helpful here. With LCTs, the shape and ___location of a signal's distribution on the time–frequency plane can be changed almost arbitrarily: LCTs can shift the time–frequency distribution to any ___location, dilate it in the horizontal and vertical directions without changing its area on the plane, shear (or twist) it, and rotate it (the [[fractional Fourier transform]]). This makes the LCT a flexible tool for analyzing and applying time–frequency distributions. Time–frequency analysis has been applied in many areas, such as disease detection from biomedical signals and images, vital-sign extraction from physiological signals, brain–computer interfaces based on brain signals, machinery fault diagnosis from vibration signals, and interference mitigation in spread-spectrum communication systems.<ref>{{cite book |last1=Pachori |first1=Ram Bilas |title=Time-Frequency Analysis Techniques and Their Applications |publisher=CRC Press|url=https://www.routledge.com/Time-Frequency-Analysis-Techniques-and-their-Applications/Pachori/p/book/9781032435763?srsltid=AfmBOorjwRC4cJ-ABXieBsYLfFSmwQdGQ3GHNvL_O5pGnBMchjM8x7S8}}</ref><ref>{{cite book |last1=Boashash |first1=Boualem |title=Time-Frequency Signal Analysis and Processing: A Comprehensive Reference |publisher=Elsevier|url=https://www.sciencedirect.com/book/9780123984999/time-frequency-signal-analysis-and-processing}}</ref>
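
As a small illustration of one of these operations (a sketch only: multiplying by the chirp <math>e^{j\pi c t^2}</math>, a special case of the LCT, shears the time–frequency distribution, so a constant-frequency tone becomes a rising chirp; the tone, shear rate and window length below are arbitrary choices):

<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import stft

fs = 1000
t = np.arange(0, 2.0, 1 / fs)
x = np.exp(2j * np.pi * 100 * t)                 # complex tone at a constant 100 Hz

c = 50                                           # shear rate in Hz per second
x_sheared = x * np.exp(1j * np.pi * c * t ** 2)  # chirp multiplication: a special-case LCT

for sig in (x, x_sheared):
    f, seg_t, Z = stft(sig, fs=fs, nperseg=256, return_onesided=False)
    peak = f[np.argmax(np.abs(Z), axis=0)]       # dominant frequency of each time slice
    print(np.round(peak[::4]))                   # flat at ~100 Hz vs. rising from ~100 to ~200 Hz
</syntaxhighlight>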
 
===Instantaneous frequency estimation===
In biomedicine, one can use time–frequency distribution to analyze the [[electromyography]] (EMG), [[electroencephalography]] (EEG), [[electrocardiogram]] (ECG) or [[otoacoustic emissions]] (OAEs).
 
==History==
In the early development of time–frequency analysis, the proposed concepts were mostly not applied to signal processing; instead, they were developed mainly for physics (quantum mechanics and acoustics) and for mathematics.
{{see also|History of wavelets}}
 
1909 marks the beginning of the development of the wavelet transform families. In that year, [[Alfréd Haar]] proposed the [[Haar transform]] to give an example of an orthonormal system for the space of [[square-integrable functions]] on the [[unit interval]] [0, 1] (the term ''wavelet'' had not yet been coined). Later, in 1946, [[Dennis Gabor]] proposed [[Gabor atoms]], which are constructed similarly to wavelets and have similar applications. In 1975, George Zweig discovered the [[continuous wavelet transform]]. In 1988, [[Stephane Mallat]] and [[Yves Meyer]] proposed the multiresolution structure of the wavelet transform (the backbone of the [[fast wavelet transform]]), while [[Ingrid Daubechies]] proposed compactly supported [[orthogonal wavelet]]s. Since then, the discrete wavelet transform has been widely used in image processing. In 1992, Wilson et al. proposed the generalized wavelet transform. In 1996, [[Ingrid Daubechies]] and Maes proposed the synchrosqueezing transform. Around the 2000s, various wavelets were developed: the [[chirplet]] by Bultan (1999), the [[curvelet]] by Donoho and Candès (2000), the [[bandlet]] by Mallat and Peyré (2002), the [[contourlet]] by Do and Vetterli (2005), and the [[shearlet]] by Kutyniok and Labate (2005). JPEG 2000, one of the image compression standards proposed by ISO, was also developed in 2000.
 
Another family of time–frequency analysis methods, the [[bilinear time–frequency distribution]]s, began its development in 1932. In that year, [[Eugene Wigner]] proposed the [[Wigner distribution function]] to provide quantum corrections to classical statistical mechanics in physics. In 1989, Hyung-Ill Choi and William J. Williams proposed the [[Choi-Williams distribution]]. Later, in 1990, Zhao, Atlas, and Marks proposed the cone-shape distribution. Then, in 1994, Boashash and O'Shea developed polynomial Wigner–Ville distributions.
 
The short-time Fourier transform family began its development in 1946, when [[Dennis Gabor]] proposed the [[Gabor transform]], a modified [[short-time Fourier transform]]. After 1965, the development of the [[Cooley-Tukey FFT algorithm]] allowed faster computation of the STFT. In 1996, Stockwell, Mansinha, and Lowe proposed the [[S transform]], which was generalized by Pinnegar and Mansinha in 2003. In 2007, Zhong and Zeng proposed the multiscale STFT, while Pei and Ding proposed the Gabor–Wigner transform.
 
Another foundational step was the [[Wigner–Ville distribution]] (Ville 1948, in a signal processing context).

The [[Hilbert–Huang transform]] is yet another time–frequency analysis technique that does not belong to the three families above. It was proposed by [[Norden E. Huang]] in 1998 and saw application developments in signal processing, climate analysis, geology, economics, and speech in the 2000s.
 
Particularly in the 1930s and 1940s, early time–frequency analysis developed in concert with [[quantum mechanics]] (Wigner developed the Wigner–Ville distribution in 1932 in quantum mechanics, and Gabor was influenced by quantum mechanics – see [[Gabor atom]]); this is reflected in the shared mathematics of the position-momentum plane and the time–frequency plane – as in the [[Heisenberg uncertainty principle]] (quantum mechanics) and the [[Gabor limit]] (time–frequency analysis), ultimately both reflecting a [[Symplectic geometry|symplectic]] structure.
Some useful techniques for filter design in time–frequency analysis, such as the [[fractional Fourier transform]] and the [[linear canonical transform]], were developed and connected to signal processing applications in the 1990s and 1970s, respectively.
 
An early practical motivation for time–frequency analysis was the development of radar – see [[ambiguity function]].
A more recent application of time–frequency analysis is signal identification with [[deep learning]] techniques (since about 2015). Kang et al. also proposed the wavelet convolutional neural network in 2017.
 
== See also ==
* [[Cone-shape distribution function]]
* [[Motions in the time-frequency distribution]]
* [[Multiresolution analysis]]
* [[Spectral density estimation]]
* [[Time–frequency analysis for music signals]]
* [[Wavelet analysis]]
 
== References ==