
Robust Reduced Rank Regression (1509.03938v5)

Published 14 Sep 2015 in math.ST, stat.AP, stat.ME, and stat.TH

Abstract: In high-dimensional multivariate regression problems, enforcing low rank in the coefficient matrix offers effective dimension reduction, which greatly facilitates parameter estimation and model interpretation. However, commonly-used reduced-rank methods are sensitive to data corruption, as the low-rank dependence structure between response variables and predictors is easily distorted by outliers. We propose a robust reduced-rank regression approach for joint modeling and outlier detection. The problem is formulated as a regularized multivariate regression with a sparse mean-shift parametrization, which generalizes and unifies some popular robust multivariate methods. An efficient thresholding-based iterative procedure is developed for optimization. We show that the algorithm is guaranteed to converge, and the coordinatewise minimum point produced is statistically accurate under regularity conditions. Our theoretical investigations focus on nonasymptotic robust analysis, which demonstrates that joint rank reduction and outlier detection leads to improved prediction accuracy. In particular, we show that redescending $\psi$-functions can essentially attain the minimax optimal error rate, and in some less challenging problems convex regularization guarantees the same low error rate. The performance of the proposed method is examined by simulation studies and real data examples.
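The abstract's model and algorithm can be illustrated with a minimal sketch: the response is modeled as $Y \approx XC + S + E$, where $C$ is a low-rank coefficient matrix and $S$ is a sparse mean-shift matrix whose nonzero rows flag outlying observations. The code below is a simplified, hypothetical rendering of the paper's thresholding-based iterative procedure (names like `robust_rrr`, the hard-thresholding rule, and the fixed iteration count are illustrative assumptions, not the authors' exact algorithm): it alternates a rank-constrained least-squares fit on the outlier-adjusted response with hard thresholding of residual row norms.

```python
import numpy as np

def robust_rrr(X, Y, rank, threshold, n_iter=50):
    """Sketch of robust reduced-rank regression with a sparse
    mean-shift term:  Y ~ X @ C + S + noise,
    where C has low rank and the nonzero rows of S mark outliers.
    This is an illustrative simplification, not the paper's exact
    estimator (which uses general thresholding/psi-functions)."""
    n, q = Y.shape
    S = np.zeros((n, q))
    pinv = np.linalg.pinv(X)
    for _ in range(n_iter):
        # Reduced-rank step: OLS fit on the outlier-adjusted
        # response, then project fitted values onto their
        # top-`rank` singular subspace.
        C_ols = pinv @ (Y - S)
        fitted = X @ C_ols
        U, s, Vt = np.linalg.svd(fitted, full_matrices=False)
        Vr = Vt[:rank]                      # top right singular vectors
        fitted_r = fitted @ Vr.T @ Vr       # rank-`rank` fitted values
        # Outlier step: hard-threshold residual row norms; rows
        # with large residuals are absorbed into the mean-shift S.
        R = Y - fitted_r
        row_norms = np.linalg.norm(R, axis=1, keepdims=True)
        S = np.where(row_norms > threshold, R, 0.0)
    # Low-rank coefficient estimate consistent with fitted_r
    C = pinv @ (Y - S) @ Vr.T @ Vr
    return C, S

# Usage: low-rank signal plus a few grossly corrupted rows.
rng = np.random.default_rng(0)
n, p, q, r = 200, 5, 4, 2
X = rng.normal(size=(n, p))
C_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))
Y = X @ C_true + 0.1 * rng.normal(size=(n, q))
Y[:5] += 10.0                               # corrupt first 5 rows
C_hat, S_hat = robust_rrr(X, Y, rank=r, threshold=3.0)
```

The hard-thresholding step corresponds to the "redescending $\psi$-function" viewpoint in the abstract: observations whose residuals exceed the threshold are fully discounted rather than merely downweighted, which is what makes joint rank reduction and outlier detection possible in one iterative loop.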
