
The Gaussian min-max theorem in the presence of convexity

Published 20 Aug 2014 in cs.IT, math.IT, and math.PR (arXiv:1408.4837v2)

Abstract: Gaussian comparison theorems are useful tools in probability theory; they are essential ingredients in the classical proofs of many results in empirical processes and extreme value theory. More recently, they have been used extensively in the analysis of non-smooth optimization problems that arise in the recovery of structured signals from noisy linear observations. We refer to such problems as Primary Optimization (PO) problems. A prominent role in the study of the (PO) problems is played by Gordon's Gaussian min-max theorem (GMT), which provides probabilistic lower bounds on the optimal cost via a simpler Auxiliary Optimization (AO) problem. Motivated by recent work of M. Stojnic, we show that under appropriate convexity assumptions the (AO) problem allows one to tightly bound both the optimal cost and the norm of the solution of the (PO). As an application, we use our result to develop a general framework to tightly characterize the performance (e.g., squared error) of a wide class of convex optimization algorithms used in the context of noisy signal recovery.
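
For context, here is a minimal sketch of the (PO)/(AO) pair as it is commonly written in this line of work; the specific symbols below (the constraint sets $S_w$, $S_u$ and the coupling function $\psi$) are illustrative assumptions and may differ from the paper's exact notation.

% Sketch of the Primary Optimization (PO) and Auxiliary Optimization (AO)
% problems appearing in Gordon's Gaussian min-max theorem (GMT).
% Notation is assumed, not taken verbatim from the paper.
\[
\text{(PO):}\quad \Phi(G) \;=\; \min_{w \in S_w}\,\max_{u \in S_u}\; u^\top G\, w \;+\; \psi(w,u),
\]
\[
\text{(AO):}\quad \phi(g,h) \;=\; \min_{w \in S_w}\,\max_{u \in S_u}\; \|w\|_2\, g^\top u \;+\; \|u\|_2\, h^\top w \;+\; \psi(w,u),
\]
% where G has i.i.d. standard Gaussian entries and g, h are independent
% standard Gaussian vectors. The GMT provides the one-sided comparison
%   P( Phi(G) < c ) <= 2 P( phi(g,h) <= c ),
% and under the convexity assumptions described in the abstract
% (S_w, S_u convex compact, psi convex in w and concave in u) the bound
% becomes two-sided, so the (AO) cost tightly tracks the (PO) cost.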
