Nonparametric estimation of Hawkes processes with RKHSs (2411.00621v2)

Published 1 Nov 2024 in stat.ML, cs.LG, and stat.ME

Abstract: This paper addresses nonparametric estimation of nonlinear multivariate Hawkes processes, where the interaction functions are assumed to lie in a reproducing kernel Hilbert space (RKHS). Motivated by applications in neuroscience, the model allows complex interaction functions, in order to express exciting and inhibiting effects, but also a combination of both (which is particularly interesting to model the refractory period of neurons), and considers in return that conditional intensities are rectified by the ReLU function. The latter feature incurs several methodological challenges, for which workarounds are proposed in this paper. In particular, it is shown that a representer theorem can be obtained for approximated versions of the log-likelihood and the least-squares criteria. Based on it, we propose an estimation method, that relies on two common approximations (of the ReLU function and of the integral operator). We provide a bound that controls the impact of these approximations. Numerical results on synthetic data confirm this fact as well as the good asymptotic behavior of the proposed estimator. It also shows that our method achieves a better performance compared to related nonparametric estimation techniques and suits neuronal applications.


Summary

  • The paper introduces a novel nonparametric method utilizing Reproducing Kernel Hilbert Spaces (RKHSs) to estimate complex nonlinear multivariate Hawkes processes.
  • Numerical experiments on synthetic data show that the RKHS-based approach accurately estimates nonlinear interactions, outperforms related parametric and nonparametric methods, and is well suited to neuronal applications.
  • This RKHS method provides a flexible tool with significant potential for modeling complex dynamics and understanding interactions in systems like neuronal networks.

Nonparametric Estimation of Nonlinear Multivariate Hawkes Processes with Reproducing Kernel Hilbert Spaces

The paper by Bonnet and Sangnier introduces a nonparametric method for estimating nonlinear multivariate Hawkes processes by leveraging reproducing kernel Hilbert spaces (RKHSs). The approach tackles challenges inherent in modeling complex interaction functions within multivariate point processes and is motivated by applications in neuroscience, where interactions between neurons must be captured precisely.

Summary of Contributions

The paper presents an estimation method for Hawkes processes under the assumption that the interaction functions lie in an RKHS. The model permits complex interactions combining exciting and inhibiting effects, which is especially relevant for modeling neuronal activity, including refractory periods; in return, the conditional intensities are rectified by the ReLU function. To handle the methodological challenges this rectification creates, the authors introduce approximations of the ReLU-based conditional intensities within both the log-likelihood and least-squares estimation frameworks.
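To make the model concrete, the following is a minimal sketch (not the authors' code) of a one-dimensional ReLU-rectified Hawkes intensity, together with a softplus-style smooth surrogate of the ReLU of the kind the approximation strategy suggests. The interaction function `h`, the `beta` sharpness parameter, and all numeric values are illustrative assumptions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softplus(x, beta=20.0):
    # Smooth surrogate for ReLU; approaches ReLU as beta grows.
    return np.log1p(np.exp(beta * x)) / beta

def intensity(t, events, mu, h, rectifier=relu):
    """Conditional intensity of a 1-D nonlinear Hawkes process:
    lambda(t) = rectifier(mu + sum_{t_k < t} h(t - t_k)),
    where h may take negative values to encode inhibition."""
    past = events[events < t]
    drive = mu + np.sum(h(t - past))
    return rectifier(drive)

# Toy interaction: brief inhibition followed by mild excitation,
# a crude stand-in for a refractory effect after each spike.
h = lambda s: -2.0 * np.exp(-10.0 * s) + 1.5 * s * np.exp(-3.0 * s)

events = np.array([0.1, 0.4, 0.9])
lam_exact  = intensity(1.0, events, mu=0.5, h=h)
lam_smooth = intensity(1.0, events, mu=0.5, h=h, rectifier=softplus)
```

Because the recent event at 0.9 contributes a negative (inhibitory) term, the rectification matters: without ReLU the "intensity" could go negative, which is exactly the issue the nonlinear model addresses.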

Key Theoretical Insights

The core theoretical contribution is a representer theorem established for approximated versions of the log-likelihood and least-squares criteria (the exact criteria do not admit one because of the ReLU rectification). This theorem yields finite-dimensional parameterizations of the interaction functions, making the otherwise infinite-dimensional optimization computationally tractable. The authors also provide a bound controlling the error incurred by the two approximations involved, of the ReLU function and of the integral operator.
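The practical upshot of a representer theorem is that each interaction function can be searched over a finite kernel expansion rather than a whole function space. The sketch below illustrates this generic form, h(t) = Σ_k α_k K(t, s_k), with a Gaussian kernel; the anchor points, weights, and bandwidth are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def gaussian_kernel(s, t, sigma=0.2):
    # Reproducing kernel of a Gaussian RKHS on the time axis.
    return np.exp(-(s - t) ** 2 / (2.0 * sigma ** 2))

def make_interaction(anchors, alphas, kernel=gaussian_kernel):
    """Finite, representer-style parameterization of an interaction
    function: h(t) = sum_k alphas[k] * K(t, anchors[k])."""
    anchors = np.asarray(anchors)
    alphas = np.asarray(alphas)
    def h(t):
        t = np.atleast_1d(t)
        return kernel(t[:, None], anchors[None, :]) @ alphas
    return h

# Anchors on a grid of lags; mixed-sign weights let the estimated
# function switch between inhibition and excitation over time.
anchors = np.linspace(0.0, 1.0, 6)
alphas  = np.array([-1.0, -0.3, 0.8, 0.5, 0.2, 0.0])
h = make_interaction(anchors, alphas)
values = h(np.array([0.05, 0.5, 0.95]))
```

With this parameterization, estimation reduces to optimizing the weight vector `alphas` under the chosen (approximated) criterion, which is what makes the problem computationally tractable.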

Numerical Experiments and Results

The method is validated on synthetic data, including experiments designed to reflect neuronal settings. The results show that the RKHS-based method outperforms traditional parametric models and related nonparametric alternatives, such as Bernstein polynomial approximations and Gaussian basis function models. Notably, the RKHS approach captures changes in the sign of an interaction function over time, a critical feature for neuronal data that mix excitation and inhibition.
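Synthetic data for experiments like these are typically generated by thinning. As a minimal, self-contained sketch (again not the authors' code), the example below simulates a purely inhibitory ReLU-rectified Hawkes process via Ogata-style thinning; restricting to a non-positive interaction function is an assumption made here so that the baseline `mu` is a valid upper bound on the intensity.

```python
import numpy as np

def relu(x):
    return max(x, 0.0)

def simulate_inhibitory_hawkes(T, mu, h, rng):
    """Ogata-style thinning for a 1-D ReLU-rectified Hawkes process
    whose interaction function h is non-positive (pure inhibition),
    so the baseline mu upper-bounds the intensity everywhere."""
    events, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / mu)     # candidate from the bound
        if t > T:
            break
        lam = relu(mu + sum(h(t - s) for s in events))
        if rng.uniform() * mu <= lam:      # accept with prob lam / mu
            events.append(t)
    return np.array(events)

# Inhibitory kernel mimicking a refractory period after each spike.
h = lambda s: -5.0 * np.exp(-8.0 * s)
rng = np.random.default_rng(0)
spikes = simulate_inhibitory_hawkes(T=50.0, mu=2.0, h=h, rng=rng)
```

For general mixed-sign interactions a valid bound must be maintained adaptively, but the accept/reject logic is the same.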

Implications and Future Directions

One prominent implication of this paper lies in its application in neuroscience, particularly for modeling and understanding complex neuronal interactions from neural spike train data. The proposed approach offers methodological flexibility that traditional linear or fixed-form models lack. Future work could consider extending this framework to spatiotemporal Hawkes processes and incorporating efficient estimation techniques, such as kernel approximation methods, to further reduce computational demands.

Additionally, adapting the method to sparse learning frameworks or operator-valued kernels could open new pathways for reducing model complexity while preserving crucial structural properties of neuronal networks. Given its methodological novelty and strong numerical performance on synthetic data, this approach provides a compelling foundation for future work on the dynamics of complex multivariate point processes, particularly in biological systems.

Conclusion

Bonnet and Sangnier's work marks a substantial step forward in nonparametric estimation for Hawkes processes, demonstrating the effectiveness of RKHS-based methods. Balancing methodological rigor with practical applicability, the approach is a useful tool for analyzing complex, interaction-rich event data across fields involving dynamical systems, neuroscience foremost among them.
