Cumulative Hazard Function Based Efficient Multivariate Temporal Point Process Learning (2404.13663v2)
Abstract: Most existing temporal point process models are characterized by the conditional intensity function. These models often require numerical approximation for likelihood evaluation, which can hurt their performance. By directly modelling the integral of the intensity function, i.e., the cumulative hazard function (CHF), the likelihood can be evaluated accurately without numerical approximation, making the CHF a promising modelling target. However, existing CHF-based methods are not well-defined, i.e., the mathematical constraints on the CHF are not fully satisfied, leading to untrustworthy results. For multivariate temporal point processes, most existing methods model an intensity (or density, etc.) function for each variate, which limits scalability. In this paper, we explore using neural networks to model a flexible yet well-defined CHF and to learn the multivariate temporal point process with low parameter complexity. Experimental results on six datasets show that the proposed model achieves state-of-the-art performance on data fitting and event prediction tasks while using significantly fewer parameters and less memory than strong competitors. The source code and data can be obtained from https://github.com/lbq8942/NPP.
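The CHF idea can be made concrete with a minimal sketch, shown below. Assuming a PyTorch setting and purely illustrative names (`MonotonicCHF`, `chf_log_likelihood`, neither taken from the paper's released code), a small network with positive weights and increasing activations yields a cumulative hazard Λ(τ) with Λ(0) = 0 and Λ'(τ) > 0; the exact log-likelihood of an inter-event time is then log Λ'(τ) − Λ(τ), where Λ'(τ) comes from automatic differentiation rather than numerical integration. This is not the paper's model: in particular, the tanh-based network below is bounded and so does not enforce Λ(τ) → ∞ as τ → ∞, one of the constraints a well-defined CHF must satisfy.

```python
# Hedged illustration of CHF-based exact likelihood evaluation (not the paper's
# implementation). All class and function names are assumptions for this sketch.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MonotonicCHF(nn.Module):
    """Cumulative hazard Lambda(tau) with Lambda(0) = 0 and Lambda'(tau) > 0."""

    def __init__(self, hidden: int = 32):
        super().__init__()
        self.w1 = nn.Parameter(torch.randn(1, hidden) * 0.1)
        self.b1 = nn.Parameter(torch.zeros(hidden))
        self.w2 = nn.Parameter(torch.randn(hidden, 1) * 0.1)

    def forward(self, tau: torch.Tensor) -> torch.Tensor:
        # Positive weights + increasing activations keep Lambda increasing in tau.
        w1, w2 = F.softplus(self.w1), F.softplus(self.w2)
        h = torch.tanh(tau @ w1 + self.b1)
        h0 = torch.tanh(torch.zeros_like(tau) @ w1 + self.b1)
        # Subtracting the value at tau = 0 enforces Lambda(0) = 0 exactly.
        return (h - h0) @ w2


def chf_log_likelihood(model: MonotonicCHF, tau: torch.Tensor) -> torch.Tensor:
    """Exact log-likelihood: log Lambda'(tau) - Lambda(tau), via autograd."""
    tau = tau.requires_grad_(True)
    Lambda = model(tau)
    # dLambda/dtau is the conditional intensity evaluated at tau.
    dLambda = torch.autograd.grad(Lambda.sum(), tau, create_graph=True)[0]
    return (torch.log(dLambda + 1e-12) - Lambda).sum()


if __name__ == "__main__":
    torch.manual_seed(0)
    model = MonotonicCHF()
    inter_event_times = torch.rand(16, 1)   # toy inter-event times
    nll = -chf_log_likelihood(model, inter_event_times)
    nll.backward()                          # trainable end to end, no integration step
    print(float(nll))
```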