On the Predictive Accuracy of Neural Temporal Point Process Models for Continuous-time Event Data (2306.17066v2)
Abstract: Temporal Point Processes (TPPs) serve as the standard mathematical framework for modeling asynchronous event sequences in continuous time. However, classical TPP models are often constrained by strong assumptions, limiting their ability to capture complex real-world event dynamics. To overcome this limitation, researchers have proposed Neural TPPs, which leverage neural network parametrizations to offer more flexible and efficient modeling. While recent studies demonstrate the effectiveness of Neural TPPs, they often lack a unified setup, relying on different baselines, datasets, and experimental configurations. This makes it challenging to identify the key factors driving improvements in predictive accuracy, hindering research progress. To bridge this gap, we present a comprehensive large-scale experimental study that systematically evaluates the predictive accuracy of state-of-the-art neural TPP models. Our study encompasses multiple real-world and synthetic event sequence datasets, following a carefully designed unified setup. We thoroughly investigate the influence of major architectural components such as event encoding, history encoder, and decoder parametrization on both time and mark prediction tasks. Additionally, we delve into the less explored area of probabilistic calibration for neural TPP models. By analyzing our results, we draw insightful conclusions regarding the significance of history size and the impact of architectural components on predictive accuracy. Furthermore, we shed light on the miscalibration of mark distributions in neural TPP models. Our study aims to provide valuable insights into the performance and characteristics of neural TPP models, contributing to a better understanding of their strengths and limitations.
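The probabilistic calibration the abstract refers to can be illustrated with a small self-contained sketch (not the paper's actual evaluation code; the function name `pit_calibration_error`, the synthetic exponential data, and the chosen rates are all assumptions for illustration). A model's predictive distribution for inter-event times is calibrated when the probability-integral-transform (PIT) values F(t_i) of observed times are uniformly distributed:

```python
import numpy as np

def pit_calibration_error(samples, cdf, n_levels=19):
    """Mean absolute gap between the empirical coverage of
    PIT values F(t_i) and the ideal uniform coverage,
    evaluated at evenly spaced probability levels."""
    pit = cdf(samples)                      # F(t_i) for each observed time
    levels = np.linspace(0.05, 0.95, n_levels)
    coverage = np.array([(pit <= q).mean() for q in levels])
    return np.abs(coverage - levels).mean()

rng = np.random.default_rng(0)
times = rng.exponential(scale=2.0, size=5000)   # synthetic inter-event times

# Well-specified predictive CDF: exponential with the true scale 2.0.
good = pit_calibration_error(times, lambda t: 1.0 - np.exp(-t / 2.0))
# Misspecified predictive CDF: wrong scale, so PIT values pile up near 1.
bad = pit_calibration_error(times, lambda t: 1.0 - np.exp(-t / 0.5))
```

A calibrated model yields `good` close to zero, while the misspecified one produces a large gap; the same coverage-vs-level comparison extends to mark distributions by checking predicted class probabilities against empirical frequencies.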
Authors: Tanguy Bosser, Souhaib Ben Taieb