Decoding methods: Difference between revisions

Maximum likelihood decoding: reword (original explanation was back to front)
Line 27:
:<math>\mathbb{P}(x \mbox{ received} \mid y \mbox{ sent})</math>
 
i.e. choose the codeword <math>y</math> that maximizes the probability that <math>x</math> was received, [[conditional probability|given that]] <math>y</math> was sent. Note that if all codewords are equally likely to be sent during ordinary use, then this scheme is equivalent to ''ideal observer decoding'', by [[Bayes Theorem]]:
 
:<math>
Line 39:
As with ''ideal observer decoding'', a convention must be agreed to for non-unique decoding.
 
The ML decoding problem can also be modeled as an [[integer programming]] problem.<ref name="feldman">J. Feldman, M. J. Wainwright and D. R. Karger, "Using Linear Programming to Decode Binary Linear Codes", IEEE Transactions on Information Theory, 51:954–972, March 2005.</ref>
 
==Minimum distance decoding==