On the Capacity of Locally Decodable Codes (1812.05566v1)
Abstract: A locally decodable code (LDC) maps $K$ source symbols, each of size $L_w$ bits, to $M$ coded symbols, each of size $L_x$ bits, such that each source symbol can be decoded from $N \leq M$ coded symbols. A perfectly smooth LDC further requires that each coded symbol is uniformly accessed when we decode any one of the messages. The ratio $L_w/L_x$ is called the symbol rate of an LDC. The highest possible symbol rate for a class of LDCs is called the capacity of that class. It is shown that given $K, N$, the maximum value of capacity of perfectly smooth LDCs, maximized over all code lengths $M$, is $C^* = N\left(1 + 1/N + 1/N^2 + \cdots + 1/N^{K-1}\right)^{-1}$. Furthermore, given $K, N$, the minimum code length $M$ for which the capacity of a perfectly smooth LDC is $C^*$ is shown to be $M = N^K$. Both of these results generalize to a broader class of LDCs, called universal LDCs. The results are then translated into the context of PIR$_{\max}$, i.e., Private Information Retrieval subject to maximum (rather than average) download cost metric. It is shown that the minimum upload cost of capacity achieving PIR$_{\max}$ schemes is $(K-1)\log N$. The results also generalize to a variation of the PIR problem, known as Repudiative Information Retrieval (RIR).
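To make the capacity expression concrete, the following is a minimal sketch (not from the paper) that evaluates $C^* = N(1 + 1/N + \cdots + 1/N^{K-1})^{-1}$ and the minimum code length $M = N^K$ exactly, using rational arithmetic; the function names are illustrative, not taken from the source.

```python
from fractions import Fraction

def ldc_capacity(K: int, N: int) -> Fraction:
    """Capacity C* = N * (1 + 1/N + 1/N^2 + ... + 1/N^(K-1))^(-1),
    computed exactly with Fractions (the sum is a finite geometric series)."""
    s = sum(Fraction(1, N**i) for i in range(K))
    return N / s

def min_code_length(K: int, N: int) -> int:
    """Minimum code length M at which a perfectly smooth LDC achieves C*."""
    return N**K

# Example: K = 2 messages, N = 2 coded symbols per decoding.
# C* = 2 / (1 + 1/2) = 4/3, achieved at code length M = 2^2 = 4.
print(ldc_capacity(2, 2))    # 4/3
print(min_code_length(2, 2)) # 4
```

For $K = 1$ the sum collapses to 1 and $C^* = N$, matching the intuition that a single message decoded from $N$ symbols can use all of them.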