A Variational Autoencoder for Neural Temporal Point Processes with Dynamic Latent Graphs (2312.16083v2)
Abstract: Continuously-observed event occurrences often exhibit self- and mutually-exciting effects, which are well modeled by temporal point processes. Beyond that, the event dynamics may also change over time, with certain periodic trends. We propose a novel variational autoencoder to capture such a mixture of temporal dynamics. More specifically, the whole time interval of the input sequence is partitioned into a set of sub-intervals. The event dynamics are assumed to be stationary within each sub-interval, but may change across sub-intervals. In particular, we use a sequential latent-variable model to learn a dependency graph between the observed dimensions for each sub-interval. The model predicts future event times by using the learned dependency graph to remove the non-contributing influences of past events. As a result, the proposed model achieves higher accuracy in predicting inter-event times and event types on several real-world event sequences, compared with existing state-of-the-art neural point processes.
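The abstract only sketches the architecture at a high level; the following is a minimal, heavily simplified PyTorch sketch of the core idea, written purely for illustration and not taken from the paper. It partitions a sequence into sub-intervals, infers a latent Bernoulli dependency graph between event types for each sub-interval with a GRU encoder and Gumbel-softmax sampling, and uses that graph to gate the influence of past events when scoring inter-event times under a simple exponential likelihood. The class name `DynamicGraphTPP`, all layer choices, and the omission of the KL term of the ELBO are assumptions of this sketch.

```python
# Illustrative sketch only; not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicGraphTPP(nn.Module):
    def __init__(self, num_types, hidden=32):
        super().__init__()
        self.num_types = num_types
        self.embed = nn.Embedding(num_types, hidden)
        self.encoder = nn.GRU(hidden + 1, hidden, batch_first=True)
        # Posterior logits for one Bernoulli edge per ordered pair of event types.
        self.edge_logits = nn.Linear(hidden, num_types * num_types)
        self.decoder = nn.GRU(hidden + 1, hidden, batch_first=True)
        self.time_head = nn.Linear(hidden, 1)  # rate of an exponential inter-event time

    def forward(self, types, dts, num_subintervals=4):
        # types: (B, L) event-type indices; dts: (B, L) inter-event times.
        B, L = types.shape
        x = torch.cat([self.embed(types), dts.unsqueeze(-1)], dim=-1)
        h, _ = self.encoder(x)  # (B, L, H)
        nll = 0.0
        for s in range(num_subintervals):
            hs = torch.chunk(h, num_subintervals, dim=1)[s]
            xs = torch.chunk(x, num_subintervals, dim=1)[s]
            ts = torch.chunk(types, num_subintervals, dim=1)[s]
            dt_s = torch.chunk(dts, num_subintervals, dim=1)[s]
            # One latent graph per sub-interval, inferred from that sub-interval's
            # last hidden state (a simplification of a sequential prior/posterior).
            logits = self.edge_logits(hs[:, -1])  # (B, K*K)
            edges = F.gumbel_softmax(
                torch.stack([logits, -logits], dim=-1), tau=0.5, hard=True
            )[..., 0].view(B, self.num_types, self.num_types)
            # Gate each event by the average incoming edge weight of its type,
            # a crude stand-in for masking non-contributing past influences.
            gate = edges.mean(dim=1).gather(1, ts)  # (B, l)
            hdec, _ = self.decoder(xs * gate.unsqueeze(-1))
            rate = F.softplus(self.time_head(hdec)).squeeze(-1) + 1e-6
            # Exponential log-likelihood of the observed inter-event times.
            nll = nll - (torch.log(rate) - rate * dt_s).mean()
        return nll

# Illustrative usage with random data.
model = DynamicGraphTPP(num_types=5)
types = torch.randint(0, 5, (8, 40))
dts = torch.rand(8, 40)
loss = model(types, dts)
loss.backward()
```

In the paper's full VAE formulation, the per-sub-interval graph posterior would be regularized toward a sequential prior via a KL term, and the likelihood would come from a neural point-process decoder over both inter-event times and event types; both are collapsed here into a plain negative log-likelihood to keep the sketch short.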