
Automatic Change-Point Detection in Time Series via Deep Learning (2211.03860v3)

Published 7 Nov 2022 in stat.ML, cs.LG, and stat.ME

Abstract: Detecting change-points in data is challenging because of the range of possible types of change and types of behaviour of data when there is no change. Statistically efficient methods for detecting a change will depend on both of these features, and it can be difficult for a practitioner to develop an appropriate detection method for their application of interest. We show how to automatically generate new offline detection methods based on training a neural network. Our approach is motivated by many existing tests for the presence of a change-point being representable by a simple neural network, and thus a neural network trained with sufficient data should have performance at least as good as these methods. We present theory that quantifies the error rate for such an approach, and how it depends on the amount of training data. Empirical results show that, even with limited training data, its performance is competitive with the standard CUSUM-based classifier for detecting a change in mean when the noise is independent and Gaussian, and can substantially outperform it in the presence of auto-correlated or heavy-tailed noise. Our method also shows strong results in detecting and localising changes in activity based on accelerometer data.
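The CUSUM-based baseline that the abstract compares against can be sketched as follows. This is a minimal illustration of the classical CUSUM test for a single change in mean (assuming independent Gaussian noise with known unit variance), not the paper's trained-network method; the function name and simulation setup are my own.

```python
import numpy as np

def cusum_statistic(x):
    """Max absolute standardized CUSUM statistic for a single change in mean.

    Returns the test statistic and the candidate change location that
    maximizes it. Large values indicate evidence of a change.
    """
    n = len(x)
    s = np.cumsum(x)                       # partial sums S_1, ..., S_n
    k = np.arange(1, n)                    # candidate change locations 1..n-1
    # Standardized CUSUM: |S_k - (k/n) S_n| scaled by sqrt(k (n - k) / n)
    stat = np.abs(s[:-1] - (k / n) * s[-1]) / np.sqrt(k * (n - k) / n)
    j = int(np.argmax(stat))
    return stat[j], int(k[j])

# Simulated example: mean shifts from 0 to 2 halfway through the series.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 100), rng.normal(2.0, 1.0, 100)])
stat, tau = cusum_statistic(x)
```

In this setup the statistic peaks near the true change location (index 100); comparing `stat` against a threshold (e.g. from the asymptotic null distribution or simulation) gives a detection rule. The paper's point is that such tests are representable by a simple neural network, so a network trained on labeled series can match them and adapt to settings (auto-correlated or heavy-tailed noise) where CUSUM is no longer optimal.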
