Improved Lower Bounds for Approximating Parameterized Nearest Codeword and Related Problems under ETH
Abstract: In this paper we present a new gap-creating randomized self-reduction for the parameterized Maximum Likelihood Decoding problem over $\mathbb{F}_p$ ($k$-MLD$_p$). The reduction takes a $k$-MLD$_p$ instance with $k\cdot n$ vectors as input, runs in time $f(k)\cdot n^{O(1)}$ for some computable function $f$, and outputs a $(3/2-\varepsilon)$-Gap-$k'$-MLD$_p$ instance for any $\varepsilon>0$, where $k'=O(k^2\log k)$. Using this reduction, we show that, assuming the randomized Exponential Time Hypothesis (ETH), no algorithm can approximate $k$-MLD$_p$ (and therefore its dual problem $k$-NCP$_p$) within factor $(3/2-\varepsilon)$ in $f(k)\cdot n^{o(\sqrt{k/\log k})}$ time for any $\varepsilon>0$. We then use the reduction by Bhattacharyya, Ghoshal, Karthik and Manurangsi (ICALP 2018) to amplify the $(3/2-\varepsilon)$-gap to any constant. As a result, we show that, assuming ETH, no algorithm can approximate $k$-NCP$_p$ and $k$-MDP$_p$ within any constant factor $\gamma$ in $f(k)\cdot n^{o(k^{\varepsilon_\gamma})}$ time, for some constant $\varepsilon_\gamma>0$ depending on $\gamma$. Combining this with the gap-preserving reduction by Bennett, Cheraghchi, Guruswami and Ribeiro (STOC 2023), we also obtain similar lower bounds for $k$-MDP$_p$, $k$-CVP$_p$ and $k$-SVP$_p$. These results improve upon the previous $f(k)\cdot n^{\Omega(\mathsf{poly}\log k)}$ lower bounds for these problems under ETH, obtained via reductions by Bhattacharyya et al. (J. ACM 2021) and Bennett et al. (STOC 2023).
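For readers unfamiliar with the problem, the sketch below illustrates one standard formulation of $k$-MLD$_p$: given vectors $v_1,\dots,v_n \in \mathbb{F}_p^d$ and a target $t$, decide whether $t$ is an $\mathbb{F}_p$-linear combination of at most $k$ of the vectors. This is an illustrative assumption rather than the paper's exact definition (the abstract's mention of $k\cdot n$ input vectors suggests a grouped variant), and the helper name `k_mld_brute_force` is hypothetical; the point is only to make the object of the lower bounds concrete.

```python
# Minimal brute-force sketch for one standard formulation of k-MLD_p
# (illustrative assumption; the paper's grouped variant may differ):
# decide whether target t in F_p^d is a linear combination, with nonzero
# coefficients from F_p, of at most k of the given vectors.
from itertools import combinations, product

def k_mld_brute_force(vectors, target, k, p):
    """Return True iff target is an F_p-combination of <= k input vectors."""
    d = len(target)
    for size in range(k + 1):
        for subset in combinations(range(len(vectors)), size):
            # Try every assignment of nonzero coefficients to the chosen vectors.
            for coeffs in product(range(1, p), repeat=size):
                combo = [0] * d
                for idx, c in zip(subset, coeffs):
                    for j in range(d):
                        combo[j] = (combo[j] + c * vectors[idx][j]) % p
                if combo == list(target):
                    return True
    return False

# Example over F_2 with the standard basis vectors of F_2^3:
vs = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(k_mld_brute_force(vs, [1, 1, 0], 2, 2))  # True: e_1 + e_2
print(k_mld_brute_force(vs, [1, 1, 0], 1, 2))  # False: needs weight 2
```

The brute force tries roughly $\sum_{s\le k}\binom{n}{s}(p-1)^s$ candidate combinations, i.e. $n^{O(k)}$ time; the paper's results say that, under ETH, even approximating the optimum within any constant factor $\gamma$ cannot be done in $f(k)\cdot n^{o(k^{\varepsilon_\gamma})}$ time.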