An $\ell_1$-oracle inequality for the Lasso in finite mixture of multivariate Gaussian regression models (1410.4682v1)

Published 17 Oct 2014 in math.ST and stat.TH

Abstract: We consider a finite mixture of multivariate Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an $\ell_1$-oracle inequality satisfied by the Lasso estimator with respect to the Kullback-Leibler loss. This result extends to the multivariate case the $\ell_1$-oracle inequality established by Meynet in \cite{Meynet}. We focus on the Lasso for its $\ell_1$-regularization properties rather than as a variable selection procedure, as was done by St\"adler et al. in \cite{Stadler}.
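
To make the setting concrete, here is a brief sketch of the objects typically involved in this framework; the notation ($K$, $B_k$, $\Sigma_k$, $\lambda$, $n$) is assumed for illustration and is not taken from the paper itself. In a finite mixture of $K$ multivariate Gaussian regressions, the conditional density of the response $Y \in \mathbb{R}^q$ given covariates $x \in \mathbb{R}^p$ is

$$ s_\theta(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k \, \varphi\big(y \,;\, B_k x, \Sigma_k\big), $$

where $\varphi(\cdot\,;\mu,\Sigma)$ denotes the multivariate Gaussian density, the $B_k$ are $q \times p$ regression matrices, and the $\pi_k$ are mixture weights. In this kind of setting the Lasso estimator is the $\ell_1$-penalized maximum-likelihood estimator

$$ \hat{s}^{\mathrm{Lasso}}(\lambda) \;\in\; \operatorname*{arg\,min}_{\theta} \Big\{ -\tfrac{1}{n} \sum_{i=1}^{n} \log s_\theta(Y_i \mid x_i) \;+\; \lambda \sum_{k=1}^{K} \|B_k\|_1 \Big\}, $$

and an $\ell_1$-oracle inequality bounds its Kullback-Leibler risk by the best trade-off between approximation error and the $\ell_1$ penalty over the parameter class, up to a remainder term.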
