=== Complexity Theory ===
One of the applications of locally decodable codes in [[Computational complexity theory|complexity theory]] is hardness amplification. Using LDCs with polynomial codeword length and polylogarithmic query complexity, one can take a function <math>L: \{0,1\}^n \rightarrow \{0,1\}</math> that is hard to compute on worst-case inputs and design a function <math>L': \{0,1\}^N \rightarrow \{0,1\}</math> that is hard to compute on average-case inputs.
Consider <math>L</math> restricted to inputs of length <math>t</math>. Then <math>L</math> can be viewed as a binary string of length <math>2^t</math>, where the bit at position <math>x</math> is <math>L(x)</math> for each <math>x \in \{ 0, 1\}^t</math>. We can use a polynomial-length locally decodable code <math>C</math> with polylogarithmic query complexity that tolerates some constant fraction of errors to encode the string representing <math>L</math>, producing a new string of length <math>2^{O(t)} = 2^{t'}</math>. We think of this new string as the truth table of a new problem <math>L'</math> on inputs of length <math>t'</math>. If <math>L'</math> is easy to solve on average, that is, we can solve <math>L'</math> correctly on a large fraction <math>1 - \epsilon</math> of inputs, then the inputs on which this solver errs correspond to a small fraction of corrupted positions in the codeword, within the fraction of errors the code tolerates. Running the local decoder with the average-case solver as its oracle therefore recovers any bit <math>L(x)</math> with high probability using only polylogarithmically many queries, so we can use <math>L'</math> to probabilistically compute <math>L</math> on all inputs. Thus, a solution to <math>L'</math> for most inputs would allow us to solve <math>L</math> on all inputs, contradicting our assumption that <math>L</math> is hard on worst-case inputs.<ref name=AppCodingTheory/><ref name=LDC1/><ref>Section 19.4 of {{Cite book
|
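The sketch below is only illustrative and is not the construction above: it uses the [[Hadamard code]], a 2-query locally decodable code whose codeword length is exponential in the message length rather than polynomial, together with hypothetical helper names such as <code>local_decode</code>, to show how a local decoder turns an oracle that is correct on most positions of an encoding into a probabilistic procedure that recovers every entry of the truth table of <math>L</math>.

<syntaxhighlight lang="python">
import random

def hadamard_encode(msg_bits):
    """Hadamard encoding of a k-bit message: the codeword has 2^k
    positions, and position y holds the inner product <msg, y> mod 2."""
    k = len(msg_bits)
    return [
        sum(msg_bits[j] & ((y >> j) & 1) for j in range(k)) % 2
        for y in range(2 ** k)
    ]

def local_decode(oracle, i, k, trials=51):
    """2-query local decoder for message bit i: pick a random mask r and
    XOR the oracle's answers at positions r and r XOR e_i; a majority
    vote over independent trials tolerates a small constant fraction of
    corrupted positions."""
    votes = 0
    for _ in range(trials):
        r = random.randrange(2 ** k)
        votes += oracle(r) ^ oracle(r ^ (1 << i))
    return int(votes > trials // 2)

# Truth table of L on t-bit inputs (a random table stands in for a
# worst-case-hard function); this is the binary string of length 2^t.
t = 3
truth_table = [random.randrange(2) for _ in range(2 ** t)]
k = 2 ** t

# L' is the encoding of that string; flip a small fraction of its bits
# to model an average-case solver that is correct on most inputs of L'.
codeword = hadamard_encode(truth_table)
corrupted = list(codeword)
for pos in random.sample(range(len(corrupted)), len(corrupted) // 20):
    corrupted[pos] ^= 1

# Treating the imperfect solver for L' as an oracle, local decoding
# recovers every value L(x) with high probability.
solver_for_L_prime = lambda pos: corrupted[pos]
recovered = [local_decode(solver_for_L_prime, x, k) for x in range(2 ** t)]
assert recovered == truth_table
</syntaxhighlight>

In the hardness-amplification argument the same mechanism is applied with a polynomial-length code, so that the decoder's polylogarithmically many queries to the average-case solver yield an efficient probabilistic worst-case algorithm for <math>L</math>.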