Minimax Optimal rates of convergence in the shuffled regression, unlinked regression, and deconvolution under vanishing noise (2404.09306v1)

Published 14 Apr 2024 in math.ST and stat.TH

Abstract: Shuffled regression and unlinked regression represent intriguing challenges that have garnered considerable attention in many fields, including ecological regression, multi-target tracking, and image denoising. However, a notable gap exists in the literature concerning the vanishing-noise regime, i.e., how the rate of estimation of the underlying signal scales with the error variance. This paper aims to bridge this gap by delving into the monotone function estimation problem under vanishing noise variance, i.e., we allow the error variance to go to $0$ as the number of observations increases. Our investigation reveals that, asymptotically, the shuffled regression problem exhibits a comparatively simpler nature than the unlinked regression: if the error variance is smaller than a threshold, then the minimax risk of shuffled regression is smaller than that of unlinked regression, whereas the minimax estimation error is of the same order in the two problems if the noise level is larger than that threshold. Our analysis is quite general in that we do not assume any smoothness of the underlying monotone link function. Because these problems are related to deconvolution, we also provide bounds for deconvolution in a similar context. Through this exploration, we contribute to understanding the intricate relationships between these statistical problems and shed light on their behavior under the nuanced constraint of vanishing noise.
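To make the two observation models concrete, here is a minimal simulation sketch in Python (not from the paper): it generates data under the shuffled and unlinked designs for a hypothetical monotone link function $f(x) = x^2$, and illustrates the standard quantile-matching idea that sorting the responses recovers a monotone signal up to noise. The estimator shown is a generic approach for monotone $f$, not necessarily the one analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
sigma = 0.1  # noise level; the vanishing-noise regime lets sigma shrink with n

x = np.sort(rng.uniform(0.0, 1.0, n))
f = np.square  # hypothetical monotone link function f(x) = x^2

# Shuffled regression: responses are an unknown permutation of f(x_i) + eps_i,
# so the pairing between covariates and responses is lost.
perm = rng.permutation(n)
y_shuffled = (f(x) + rng.normal(0.0, sigma, n))[perm]

# Unlinked regression: only two unpaired samples are observed; the responses
# come from independent draws x'_j rather than the observed x_i.
x_prime = rng.uniform(0.0, 1.0, n)
y_unlinked = f(x_prime) + rng.normal(0.0, sigma, n)

# For monotone f, a natural estimator matches empirical quantiles:
# sorting the responses aligns them with the sorted covariates.
f_hat_at_sorted_x = np.sort(y_shuffled)
```

Since $f$ is monotone, the sorted responses `f_hat_at_sorted_x` form a nondecreasing estimate of $f$ evaluated at the sorted covariates; as `sigma` shrinks, this alignment becomes exact, which is the regime the paper quantifies.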
