
Two-Sided Weak Submodularity for Matroid Constrained Optimization and Regression (2102.09644v2)

Published 18 Feb 2021 in cs.DS

Abstract: We study the following problem: Given a variable of interest, we would like to find a best linear predictor for it by choosing a subset of $k$ relevant variables obeying a matroid constraint. This problem is a natural generalization of subset selection problems where it is necessary to spread observations amongst multiple different classes. We derive new, strengthened guarantees for this problem by improving the analysis of the residual random greedy algorithm and by developing a novel distorted local-search algorithm. To quantify our approximation guarantees, we refine the definition of weak submodularity by Das and Kempe and introduce the notion of an upper submodularity ratio, which we connect to the minimum $k$-sparse eigenvalue of the covariance matrix. More generally, we look at the problem of maximizing a set function $f$ with lower and upper submodularity ratios $\gamma$ and $\beta$ under a matroid constraint. For this problem, our algorithms have asymptotic approximation guarantees approaching $1/2$ and $1-e^{-1}$ as the function becomes closer to submodular. As a second application, we show that the Bayesian A-optimal design objective falls into our framework, leading to new guarantees for this problem as well.
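The residual random greedy algorithm the abstract refers to can be sketched as follows. This is a minimal illustration of the standard matroid variant of random greedy (in each round, build a maximum-weight independent set of marginal gains in the contracted matroid, then add one of its elements uniformly at random), not the paper's exact procedure; the partition-matroid example, element names, and weights below are invented for demonstration.

```python
import random

def residual_random_greedy(f, ground, independent, k, rng=random):
    """Sketch of residual random greedy under a matroid constraint.

    f           -- set function to maximize (monotone, weakly submodular)
    ground      -- set of ground elements
    independent -- independence oracle of the matroid: set -> bool
    k           -- number of rounds (the matroid's rank)
    """
    S = set()
    for _ in range(k):
        # Marginal gain of every element whose addition keeps S independent.
        gains = {e: f(S | {e}) - f(S)
                 for e in ground - S if independent(S | {e})}
        if not gains:
            break
        # Greedily build a max-weight independent set M in the matroid
        # contracted by S, using the marginal gains as weights.
        M = []
        for e in sorted(gains, key=gains.get, reverse=True):
            if independent(S | set(M) | {e}):
                M.append(e)
        # The "random" step: add one uniformly random element of M.
        S.add(rng.choice(M))
    return S

# Hypothetical example: a partition matroid (at most one element per class)
# with a modular objective, for which gamma = beta = 1.
w   = {'a': 3, 'b': 1, 'c': 2, 'd': 5}   # element weights
cls = {'a': 0, 'b': 0, 'c': 1, 'd': 1}   # element -> class

def independent(S):
    used = [cls[e] for e in S]
    return all(used.count(c) <= 1 for c in used)

def f(S):
    return sum(w[e] for e in S)
```

For a modular objective the algorithm picks a maximum-weight basis regardless of the random choices; the paper's contribution lies in analyzing how far the output can degrade as the lower and upper submodularity ratios move away from 1.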
