An \ell_1-oracle inequality for the Lasso in finite mixture of multivariate Gaussian regression models
Published 17 Oct 2014 in math.ST and stat.TH | arXiv:1410.4682v1
Abstract: We consider a multivariate finite mixture of Gaussian regression models for high-dimensional data, where the number of covariates and the size of the response may be much larger than the sample size. We provide an $\ell_1$-oracle inequality satisfied by the Lasso estimator with respect to the Kullback-Leibler loss. This result extends to the multivariate case the $\ell_1$-oracle inequality established by Meynet in \cite{Meynet}. We focus on the Lasso for its $\ell_1$-regularization properties rather than as a variable selection procedure, as was done by St\"adler in \cite{Stadler}.
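To fix ideas, a standard formulation of the model and estimator described in the abstract can be sketched as follows; the notation here ($K$ components, regression matrices $\beta_k$, covariances $\Sigma_k$, tuning parameter $\lambda$) is an assumption based on common conventions for mixtures of Gaussian regressions, not taken verbatim from the paper:

```latex
% Conditional density of the response y ∈ R^q given covariates x ∈ R^p,
% as a K-component finite mixture of multivariate Gaussian regressions:
s_\theta(y \mid x) \;=\; \sum_{k=1}^{K} \pi_k \,
  \varphi\!\left(y;\, \beta_k x,\, \Sigma_k\right),
\qquad \pi_k \ge 0,\ \sum_{k=1}^{K} \pi_k = 1,

% where φ(·; μ, Σ) is the multivariate Gaussian density.
% The Lasso estimator penalizes the negative log-likelihood by the
% ℓ1-norm of the regression coefficients:
\hat{s}^{\mathrm{Lasso}}(\lambda) \;\in\;
\operatorname*{arg\,min}_{\theta}
\left\{ -\frac{1}{n} \sum_{i=1}^{n} \log s_\theta(y_i \mid x_i)
  \;+\; \lambda \sum_{k=1}^{K} \|\beta_k\|_1 \right\}.

% An ℓ1-oracle inequality then bounds the Kullback-Leibler risk of
% ŝ^Lasso(λ) by the best penalized trade-off over the model class,
% up to a remainder term, for a suitable choice of λ.
```

This is only a schematic of the high-dimensional setting ($p$, $q$ possibly much larger than $n$) that the oracle inequality addresses; the paper's exact assumptions and constants should be read from the full text.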