
High dimensional errors-in-variables models with dependent measurements

Published 9 Feb 2015 in math.ST, stat.ML, and stat.TH | arXiv:1502.02355v2

Abstract: Suppose that we observe $y \in \mathbb{R}^f$ and $X \in \mathbb{R}^{f \times m}$ in the following errors-in-variables model: \begin{eqnarray*} y & = & X_0 \beta^* + \epsilon \\ X & = & X_0 + W \end{eqnarray*} where $X_0$ is an $f \times m$ design matrix with independent subgaussian row vectors, $\epsilon \in \mathbb{R}^f$ is a noise vector, and $W$ is a mean-zero $f \times m$ random noise matrix with independent subgaussian column vectors, independent of $X_0$ and $\epsilon$. This model differs significantly from those analyzed in the literature in that we allow the measurement error for each covariate to be a vector that is dependent across its $f$ observations. Such error structures appear in the science literature when modeling the trial-to-trial fluctuations in response strength shared across a set of neurons. Under sparsity and restricted eigenvalue type conditions, we show that one can recover a sparse vector $\beta^* \in \mathbb{R}^m$ from the model given a single observation matrix $X$ and the response vector $y$. We establish consistency in estimating $\beta^*$ and obtain rates of convergence in the $\ell_q$ norm, where $q = 1, 2$ for the Lasso-type estimator, and $q \in [1, 2]$ for a Dantzig-type conic programming estimator. We show error bounds which approach those of the regular Lasso and the Dantzig selector as the errors in $W$ tend to 0.
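To make the model concrete, here is a minimal simulation sketch (an illustration, not code from the paper) that generates data with a sparse $\beta^*$, independent subgaussian rows for $X_0$, and measurement-error columns of $W$ that are independent of each other but dependent across the $f$ observations. The AR(1)-style covariance used for the columns of $W$ is an assumed choice for illustration; the paper only requires subgaussianity.

```python
import numpy as np

rng = np.random.default_rng(0)
f, m, s = 200, 50, 5          # observations, covariates, sparsity level

# Sparse coefficient vector beta* with s nonzero entries.
beta_star = np.zeros(m)
beta_star[:s] = 1.0

# X0: independent Gaussian (hence subgaussian) rows.
X0 = rng.standard_normal((f, m))

# W: columns independent of one another, but each column is a
# dependent vector across the f observations. An AR(1) covariance
# (illustrative choice) induces the within-column dependence.
rho, tau = 0.7, 0.1
idx = np.arange(f)
cov = tau**2 * rho ** np.abs(np.subtract.outer(idx, idx))
L = np.linalg.cholesky(cov)
W = L @ rng.standard_normal((f, m))

eps = 0.1 * rng.standard_normal(f)    # noise vector
y = X0 @ beta_star + eps              # response from the true design
X = X0 + W                            # observed, error-corrupted design
```

With $(X, y)$ in hand, a naive Lasso ignores $W$ entirely; the estimators analyzed in the paper instead work with noise-corrected quantities, which is what yields the stated $\ell_q$ rates.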
