{{Short description|Algorithm for analyzing noisy data streams}}
'''Maximum likelihood sequence estimation''' ('''MLSE''') is a [[mathematical algorithm]] for extracting useful data from a noisy data stream.
==Theory==
Suppose that there is an underlying signal {''x''(''t'')}, of which an observed signal {''r''(''t'')} is available. The observed signal ''r'' is related to ''x'' via a transformation that may be nonlinear and may involve attenuation, and would usually involve the incorporation of [[random noise]]. The [[statistical parameter]]s of this transformation are assumed to be known. The problem to be solved is to use the observations {''r''(''t'')} to create a good estimate of {''x''(''t'')}.
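The setup above can be sketched concretely. The observation model below is purely hypothetical (the article does not fix one): attenuation followed by a memoryless nonlinearity, plus additive Gaussian noise with known variance, producing an observed series {''r''(''t'')} from an underlying {''x''(''t'')}.

```python
import numpy as np

rng = np.random.default_rng(42)

def observe(x, atten=0.8, sigma=0.05):
    """Produce the observed series r(t) from the underlying x(t).

    Hypothetical transformation: attenuation by `atten`, a tanh
    nonlinearity, and additive Gaussian noise of std `sigma` (the
    'known statistical parameters' of the model).
    """
    x = np.asarray(x, dtype=float)
    return np.tanh(atten * x) + sigma * rng.standard_normal(len(x))

x = np.array([1.0, -1.0, 1.0, 1.0, -1.0])  # underlying signal x(t)
r = observe(x)                             # observed signal r(t)
```

The estimation problem is then to recover ''x'' from ''r'' alone, given knowledge of `atten` and `sigma`.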
Maximum likelihood sequence estimation is formally the application of [[maximum likelihood]] to this problem. That is, the estimate of {''x''(''t'')} is defined to be a sequence of values which maximize the functional
:<math>L(x)=p(r\mid x),</math>
where ''p''(''r'' | ''x'') denotes the conditional joint probability density function of the observed series {''r''(''t'')} given that the underlying series has the values {''x''(''t'')}.
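As an illustration of maximizing ''L''(''x'') = ''p''(''r'' | ''x''), the sketch below assumes a linear channel with known impulse response and additive white Gaussian noise, so that maximizing the likelihood is equivalent to minimizing the squared error between ''r'' and the noiseless channel output. It searches all candidate sequences by brute force (the channel `h`, alphabet, and noise level are illustrative assumptions; practical receivers use the Viterbi algorithm instead of exhaustive search).

```python
import itertools
import numpy as np

def mlse_bruteforce(r, h, alphabet=(-1, 1)):
    """Brute-force MLSE for a known linear channel h with additive
    white Gaussian noise.  Under that noise model, maximizing
    p(r | x) is equivalent to minimizing ||r - h*x||^2."""
    n = len(r)
    best_x, best_cost = None, np.inf
    for cand in itertools.product(alphabet, repeat=n):
        y = np.convolve(cand, h)[:n]   # noiseless channel output
        cost = np.sum((r - y) ** 2)
        if cost < best_cost:
            best_x, best_cost = cand, cost
    return best_x

# Hypothetical 2-tap channel with intersymbol interference, mild noise.
h = np.array([1.0, 0.5])
x_true = (1, -1, 1, 1, -1)
rng = np.random.default_rng(0)
r = np.convolve(x_true, h)[:5] + 0.1 * rng.standard_normal(5)
x_hat = mlse_bruteforce(r, h)
```

The exhaustive search makes the likelihood maximization explicit but costs |alphabet|<sup>''n''</sup> evaluations, which is why trellis-based methods matter in practice.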
In contrast, the related method of maximum a posteriori estimation is formally the application of the [[maximum a posteriori]] (MAP) estimation approach. This is more complex than maximum likelihood sequence estimation and requires a known distribution (in [[Bayesian inference|Bayesian terms]], a [[prior distribution]]) for the underlying signal. In this case the estimate of {''x''(''t'')} is defined to be a sequence of values which maximize the functional
:<math>P(x)=p(x\mid r),</math>
where ''p''(''x'' | ''r'') denotes the conditional joint probability density function of the underlying series {''x''(''t'')} given that the observed series has taken the values {''r''(''t'')}. [[Bayes' theorem]] implies that
:<math>p(x\mid r)=\frac{p(r\mid x)\,p(x)}{p(r)},</math>
so that maximizing ''p''(''x'' | ''r'') over ''x'' is equivalent to maximizing the product ''p''(''r'' | ''x'')''p''(''x''), since ''p''(''r'') does not depend on ''x''. When the prior ''p''(''x'') is uniform, MAP estimation reduces to maximum likelihood sequence estimation.
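The MAP criterion can be sketched in the same brute-force style: under an assumed Gaussian noise model the log-likelihood of a candidate sequence is <code>-||r - h*x||^2 / (2*sigma2)</code>, and the (assumed, illustrative) log prior is simply added to it before maximizing. The channel, prior, and observations below are hypothetical.

```python
import itertools
import numpy as np

def map_sequence(r, h, log_prior, sigma2=1.0, alphabet=(-1, 1)):
    """MAP sequence estimate: maximize p(x | r), which by Bayes'
    theorem is proportional to p(r | x) * p(x).  Assumes a known
    linear channel h and Gaussian noise of variance sigma2."""
    n = len(r)
    best_x, best_score = None, -np.inf
    for cand in itertools.product(alphabet, repeat=n):
        y = np.convolve(cand, h)[:n]   # noiseless channel output
        score = -np.sum((r - y) ** 2) / (2 * sigma2) + log_prior(cand)
        if score > best_score:
            best_x, best_score = cand, score
    return best_x

# Hypothetical prior: each symbol is independently +1 with probability 0.7.
def log_prior(x):
    return sum(np.log(0.7) if s == 1 else np.log(0.3) for s in x)

h = np.array([1.0])                 # memoryless channel for simplicity
r = np.array([0.2, -0.9, 0.3])      # noisy observations
x_map = map_sequence(r, h, log_prior, sigma2=0.5)
```

Passing a flat prior (`log_prior = lambda x: 0.0`) makes this coincide with the maximum likelihood estimate, reflecting the reduction noted above.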