The rate of a code is the ratio between its message length and its codeword length, <math>\frac{|x|}{|C(x)|}</math>, and the number of queries required to recover one bit of the message is called the query complexity of the code.
The rate of a code is inversely related to its query complexity, but the exact shape of this tradeoff is a major open problem.<ref name=LDC1>{{cite web|url=http://research.microsoft.com/en-us/um/people/yekhanin/Papers/LDC_now.pdf |format=PDF |title=Locally Decodable Codes |author=Sergey Yekhanin}}</ref><ref name=LDC2>{{cite web|url=https://www.iacr.org/workshops/tcc2012/survey_tcc.pdf |format=PDF |title=Locally Decodable Codes |author=Sergey Yekhanin}}</ref> It is known that there are no LDCs that query the codeword in only one position, and that the optimal codeword length for query complexity 2 is exponential in the length of the original message.<ref name=LDC1/> However, no tight lower bounds are known for codes with query complexity greater than 2. Approaching the tradeoff from the side of codeword length, codes with codeword length proportional to the message length are known with query complexity <math>k^\epsilon</math> for <math>\epsilon > 0</math>,<ref name=LDC1/> and subsequent work has continued to improve this tradeoff. There are also codes in between, with codeword length polynomial in the message length and polylogarithmic query complexity.<ref name=LDC1/>
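As a concrete illustration of the 2-query end of this tradeoff, the following minimal Python sketch (not taken from the cited surveys; the function names are chosen here for exposition) implements the Hadamard code, a classic 2-query LDC. Its codeword lists <math>\langle a, x \rangle \bmod 2</math> for every <math>a \in \{0,1\}^k</math>, so the codeword length <math>2^k</math> is exponential in the message length <math>k</math>, matching the statement above about query complexity 2.

<syntaxhighlight lang="python">
import random

def hadamard_encode(x):
    """Codeword entry at index a is the inner product <a, x> mod 2."""
    k = len(x)
    return [sum(((a >> j) & 1) * x[j] for j in range(k)) % 2
            for a in range(2 ** k)]

def local_decode(codeword, i, k):
    """Recover message bit x_i with 2 queries: C[a] xor C[a ^ e_i] = x_i,
    since the inner product is linear over GF(2)."""
    a = random.randrange(2 ** k)
    b = a ^ (1 << i)  # flip the i-th coordinate of the query index
    return codeword[a] ^ codeword[b]

x = [1, 0, 1, 1]
c = hadamard_encode(x)
assert all(local_decode(c, i, len(x)) == x[i] for i in range(len(x)))
</syntaxhighlight>

Because each decoded bit depends on only two codeword positions chosen at random, the decoder still returns the correct bit with high probability when a small constant fraction of the codeword is corrupted; the price is the exponentially small rate <math>k / 2^k</math>.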
== Applications ==