Non trivial optimal sampling rate for estimating a Lipschitz-continuous function in presence of mean-reverting Ornstein-Uhlenbeck noise (2405.10795v1)

Published 17 May 2024 in math.ST, math.PR, stat.ME, and stat.TH

Abstract: We examine a mean-reverting Ornstein-Uhlenbeck process that perturbs an unknown Lipschitz-continuous drift and aim to estimate the drift's value at a predetermined time horizon by sampling the path of the process. Due to the time-varying nature of the drift, we propose an estimation procedure based on an online, time-varying optimization scheme, implemented as a stochastic gradient ascent algorithm that maximizes the log-likelihood of our observations. The objective of the paper is to investigate the optimal sample size/rate for achieving the minimum mean square distance between our estimator and the true value of the drift. In this setting we uncover a trade-off between the correlation of the observations, which increases with the sample size, and the dynamic nature of the unknown drift, which is weakened by increasing the frequency of observation. The mean square error is shown to be non-monotonic in the sample size, attaining a global minimum whose precise description depends on the parameters that govern the model. In the static case, i.e. when the unknown drift is constant, our method outperforms the arithmetic mean of the observations in highly correlated regimes, despite the latter being a natural candidate estimator. We then compare our online estimator with the global maximum likelihood estimator.
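The kind of online scheme the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the drift function, the parameters θ and σ, the step size, and the sampling grid are all hypothetical choices, and the drift is treated as locally constant between consecutive observations so that each one-step Gaussian transition density yields a simple gradient for the ascent update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative model parameters (not taken from the paper)
theta, sigma = 1.0, 0.5            # OU mean-reversion speed and volatility
T, n = 1.0, 200                    # time horizon and number of samples
dt = T / n
rho = np.exp(-theta * dt)          # one-step autocorrelation of the OU noise
v = sigma**2 * (1 - rho**2) / (2 * theta)  # one-step conditional variance

def drift(t):
    # Hypothetical Lipschitz-continuous drift, unknown to the estimator
    return np.sin(2 * np.pi * t)

# Simulate X_t = drift(t) + Y_t, with Y a zero-mean OU process
y = 0.0
x = np.empty(n + 1)
x[0] = drift(0.0)
for i in range(1, n + 1):
    y = rho * y + np.sqrt(v) * rng.standard_normal()
    x[i] = drift(i * dt) + y

# Online stochastic gradient ascent on the one-step log-likelihood,
# treating the drift as a locally constant level mu between samples.
mu, eta = x[0], 0.5                # initial guess and step size (hypothetical tuning)
for i in range(1, n + 1):
    m = mu + (x[i - 1] - mu) * rho          # conditional mean under current mu
    grad = (x[i] - m) * (1 - rho) / v       # d/dmu of the Gaussian log-likelihood term
    mu += eta * grad                        # ascent step toward the current drift level

print(f"estimate at horizon: {mu:.3f}, true drift: {drift(T):.3f}")
```

Note how the gradient scale (1 - rho) / v shrinks as the sampling interval dt decreases: denser sampling makes consecutive observations more correlated and each of them less informative about the drift level, which is the trade-off the paper analyzes.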

