
Estimating the Hawkes process from a discretely observed sample path (2401.11075v1)

Published 20 Jan 2024 in stat.ME and stat.CO

Abstract: The Hawkes process is a widely used model in many areas, such as finance, seismology, neuroscience, epidemiology, and the social sciences. Estimating the Hawkes process from continuous observations of a sample path is relatively straightforward using maximum likelihood or other methods. However, estimating the parameters of a Hawkes process from observations of a sample path at discrete time points only is challenging, because the likelihood is intractable with such data. In this work, we introduce a method to estimate the Hawkes process from a discretely observed sample path. The method takes advantage of a state-space representation of the incomplete-data problem and uses sequential Monte Carlo (SMC, also known as particle filtering) to approximate the likelihood function. As an estimator of the likelihood function, the SMC approximation is unbiased, and it can therefore be used together with the Metropolis-Hastings algorithm to construct Markov chains that approximate the likelihood distribution or, more generally, the posterior distribution of the model parameters. The performance of the methodology is assessed using simulation experiments and compared with other recently published methods. The proposed estimator is found to have a smaller mean squared error than the two benchmark estimators. The proposed method has the additional advantage that confidence intervals for the parameters are easily available. We apply the proposed estimator to the analysis of weekly count data on measles cases in Tokyo, Japan, and compare the results to those obtained by one of the benchmark methods.
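The key mechanism the abstract describes — plugging an unbiased likelihood estimator into Metropolis-Hastings — is the pseudo-marginal MCMC idea. The sketch below illustrates only that mechanism on a deliberately simple stand-in: i.i.d. Poisson counts with artificial unbiased multiplicative noise in place of the paper's SMC estimator for the Hawkes likelihood. All names, the noise model, and the tuning constants are illustrative assumptions, not the paper's implementation.

```python
import math
import random

random.seed(1)

def rpois(lam):
    """Poisson sampler via Knuth's method (illustrative; small lam only)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

# Toy data: i.i.d. Poisson counts standing in for binned event counts.
lam_true = 3.0
data = [rpois(lam_true) for _ in range(200)]

def exact_loglik(lam):
    return sum(y * math.log(lam) - lam - math.lgamma(y + 1) for y in data)

def noisy_loglik(lam, sigma=0.3):
    # Stand-in for an SMC log-likelihood estimate: multiplicative noise with
    # E[exp(eps)] = 1, so exp(noisy_loglik) is unbiased for the likelihood.
    eps = random.gauss(-0.5 * sigma ** 2, sigma)
    return exact_loglik(lam) + eps

# Pseudo-marginal random-walk Metropolis-Hastings (flat prior on lam > 0).
# Key detail: the noisy log-likelihood of the current state is recycled,
# never recomputed; this is what makes the chain target the exact posterior.
lam, ll = 1.0, noisy_loglik(1.0)
samples = []
for _ in range(5000):
    prop = lam + random.gauss(0.0, 0.2)
    if prop > 0:
        ll_prop = noisy_loglik(prop)
        if math.log(random.random()) < ll_prop - ll:
            lam, ll = prop, ll_prop
    samples.append(lam)

post = samples[1000:]            # discard burn-in
mean = sum(post) / len(post)     # posterior mean, close to lam_true
```

In the paper's setting the Poisson log-likelihood above would be replaced by a particle-filter estimate over the latent event times between observation points; the Metropolis-Hastings logic is unchanged, which is why the unbiasedness of the SMC estimator is the property that matters.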

