Autocovariance estimation in regression with a discontinuous signal and $m$-dependent errors: A difference-based approach
Abstract: We discuss a class of difference-based estimators for the autocovariance in nonparametric regression when the signal is discontinuous (change-point regression), possibly highly fluctuating, and the errors form a stationary $m$-dependent process. These estimators circumvent the explicit pre-estimation of the unknown regression function, a task which is particularly challenging for such signals. We provide explicit expressions for their mean squared errors when the signal function is piecewise constant (segment regression) and the errors are Gaussian. Based on these expressions, we derive bias-optimized estimates which do not depend on the particular (unknown) autocovariance structure. Notably, for positively correlated errors, the part of the variance of our estimators which depends on the signal is minimal as well. Further, we provide sufficient conditions for $\sqrt{n}$-consistency; this result is extended to piecewise Hölder regression with non-Gaussian errors. We combine our bias-optimized autocovariance estimates with a projection-based approach and derive covariance matrix estimates, a method which is of independent interest. Several simulation studies as well as an application to biophysical measurements complement this paper.
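To illustrate the general idea behind difference-based estimation (not the authors' bias-optimized estimator), the following sketch assumes a piecewise constant signal with few jumps and stationary $m$-dependent errors. Away from jumps, a lag-$j$ difference $D_j(i) = Y_{i+j} - Y_i$ satisfies $\mathbb{E}[D_j(i)^2] = 2(\gamma(0) - \gamma(j))$, and $\gamma(j) = 0$ for $j > m$, so lag $m+1$ identifies $\gamma(0)$; each jump perturbs only $O(1)$ difference terms. The function name and structure here are illustrative, not from the paper.

```python
import numpy as np

def diff_based_autocov(y, m):
    """Sketch of a difference-based autocovariance estimator for lags 0..m.

    Assumes: stationary m-dependent errors, piecewise constant signal with
    few jumps (each jump affects only O(1) of the O(n) difference terms,
    so its contribution to the averages below is asymptotically negligible).
    """
    n = len(y)
    # Lag m+1 exceeds the dependence range, so gamma(m+1) = 0 and
    # E[D_{m+1}^2] = 2*gamma(0) away from jumps.
    d0 = y[m + 1:] - y[:n - m - 1]
    gamma = np.empty(m + 1)
    gamma[0] = 0.5 * np.mean(d0 ** 2)
    # For h <= m: E[D_h^2] = 2*(gamma(0) - gamma(h)).
    for h in range(1, m + 1):
        dh = y[h:] - y[:n - h]
        gamma[h] = gamma[0] - 0.5 * np.mean(dh ** 2)
    return gamma
```

For MA(1) errors $e_i = \varepsilon_i + \theta\,\varepsilon_{i-1}$ (which are 1-dependent, with $\gamma(0) = (1+\theta^2)\sigma^2$ and $\gamma(1) = \theta\sigma^2$), this sketch recovers both autocovariances without any pre-estimation of the step signal.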