On the Hardness of Compressing Weights (2107.02554v1)

Published 6 Jul 2021 in cs.DS and cs.CC

Abstract: We investigate computational problems involving large weights through the lens of kernelization, which is a framework of polynomial-time preprocessing aimed at compressing the instance size. Our main focus is the weighted Clique problem, where we are given an edge-weighted graph and the goal is to detect a clique of total weight equal to a prescribed value. We show that the weighted variant, parameterized by the number of vertices $n$, is significantly harder than the unweighted problem by presenting an $O(n^{3 - \varepsilon})$ lower bound on the size of the kernel, under the assumption that NP $\not\subseteq$ coNP/poly. This lower bound is essentially tight: we show that we can reduce the problem to the case with weights bounded by $2^{O(n)}$, which yields a randomized kernel of $O(n^3)$ bits. We generalize these results to the weighted $d$-Uniform Hyperclique problem, Subset Sum, and weighted variants of Boolean Constraint Satisfaction Problems (CSPs). We also study weighted minimization problems and show that weight compression is easier when we only want to preserve the collection of optimal solutions. Namely, we show that for node-weighted Vertex Cover on bipartite graphs it is possible to maintain the set of optimal solutions using integer weights from the range $[1, n]$, but if we want to maintain the ordering of the weights of all inclusion-minimal solutions, then weights as large as $2^{\Omega(n)}$ are necessary.
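To make the problem statement concrete, below is a minimal brute-force sketch in Python of the Weighted Clique decision problem as the abstract states it: given an edge-weighted graph and a prescribed value, decide whether some clique has total edge weight exactly equal to that value. The function name `has_clique_of_weight` and the dictionary-based graph encoding are illustrative assumptions, not from the paper; the paper is about kernelization (polynomial-time preprocessing that shrinks the instance), not about solving the problem by exhaustive search.

```python
# Illustrative brute-force check for the Weighted Clique decision problem
# described in the abstract. Exponential time; for intuition only.
from itertools import combinations

def has_clique_of_weight(n, weights, t):
    """n: number of vertices, labeled 0..n-1.
    weights: dict mapping frozenset({u, v}) -> integer edge weight;
             a pair absent from the dict is a non-edge.
    t: the prescribed total weight."""
    for size in range(n + 1):
        for subset in combinations(range(n), size):
            pairs = [frozenset(p) for p in combinations(subset, 2)]
            if all(p in weights for p in pairs):          # subset is a clique
                if sum(weights[p] for p in pairs) == t:   # weight equals t
                    return True
    return False

# Toy usage: a triangle on {0, 1, 2} with edge weights 2, 3, 5
# has total weight 10, and no clique has weight 4.
w = {frozenset({0, 1}): 2, frozenset({1, 2}): 3, frozenset({0, 2}): 5}
assert has_clique_of_weight(3, w, 10)
assert not has_clique_of_weight(3, w, 4)
```

Since the edge weights may be as large as the abstract's $2^{O(n)}$ bound allows, even writing an instance down can take $\Theta(n^3)$ bits ($\Theta(n^2)$ edges times $\Theta(n)$ bits per weight), which is exactly the regime the paper's kernel upper and lower bounds address.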

Citations (3)
