
Optimality and computational barriers in variable selection under dependence (2510.03990v1)

Published 5 Oct 2025 in math.ST and stat.TH

Abstract: We study the optimal sample complexity of variable selection in linear regression under general design covariance, and show that subset selection is optimal, while, under standard complexity assumptions, no efficient algorithm for this problem exists. Specifically, we analyze the variable selection problem and provide the optimal sample complexity with exact dependence on the problem parameters for both known and unknown sparsity settings. Moreover, we establish a sample complexity lower bound for any efficient estimator, highlighting a gap between the statistical efficiency achievable by combinatorial algorithms (such as subset selection) and that achievable by efficient algorithms (such as those based on convex programming). The proofs rely on a finite-sample analysis of an information criterion estimator, which may be of independent interest. Our results emphasize the optimality of subset selection, the critical role played by restricted eigenvalues, and characterize the statistical-computational trade-off in high-dimensional variable selection.
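To make the combinatorial estimator concrete, here is a minimal sketch of exhaustive best-subset selection in the known-sparsity setting: enumerate every size-k support and keep the one with the smallest residual sum of squares. This is a generic illustration, not the paper's exact estimator; the function name and the synthetic design below are assumptions for the example.

```python
# Minimal sketch: exhaustive best-subset selection with known sparsity k.
# Illustrative only -- not the estimator analyzed in the paper.
from itertools import combinations

import numpy as np


def best_subset(X, y, k):
    """Return the size-k support whose least-squares fit minimizes the RSS."""
    n, p = X.shape
    best_rss, best_support = np.inf, None
    for support in combinations(range(p), k):
        # Least-squares fit restricted to the candidate support.
        beta_s, *_ = np.linalg.lstsq(X[:, support], y, rcond=None)
        rss = np.sum((y - X[:, support] @ beta_s) ** 2)
        if rss < best_rss:
            best_rss, best_support = rss, support
    return best_support


# Well-conditioned synthetic design with true support {0, 3}.
rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0], beta[3] = 3.0, -2.0
y = X @ beta + 0.1 * rng.standard_normal(n)

S_hat = best_subset(X, y, k=2)
```

The loop visits all C(p, k) supports, which is exactly the exponential cost behind the statistical-computational gap the abstract describes: this brute-force search attains the optimal sample complexity but is not an efficient algorithm. In the unknown-sparsity setting, one would instead score each support with an information criterion (RSS plus a complexity penalty) rather than fixing k.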


Authors (2)




This paper has been mentioned in 1 post and received 6 likes.