Joint Modeling of Event Sequence and Time Series with Attentional Twin Recurrent Neural Networks (1703.08524v1)

Published 24 Mar 2017 in cs.LG

Abstract: A variety of real-world processes (over networks) produce sequences of data whose complex temporal dynamics need to be studied. In particular, event timestamps can carry important information about the underlying network dynamics that is not available from time series evenly sampled from continuous signals. Moreover, in many complex processes, event sequences and evenly sampled time series interact with each other, which makes joint modeling of the two data sources necessary. To tackle these problems, we use the rich framework of (temporal) point processes to model event data and continuously update the intensity function with twin Recurrent Neural Networks (RNNs). In the proposed architecture, the intensity function is synergistically modulated by one RNN that takes the asynchronous events as input and another RNN that takes the evenly sampled time series as input. Furthermore, to enhance the interpretability of the model, an attention mechanism for the neural point process is introduced. The whole model, with event type and timestamp prediction output layers, can be trained end-to-end and allows a black-box treatment of the intensity. We substantiate the superiority of our model on synthetic data and three real-world benchmark datasets.
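
The abstract describes an architecture that pairs one RNN over the asynchronous event stream with a second RNN over the evenly sampled time series, combines their hidden states to drive the point-process intensity, and adds an attention mechanism plus event-type and timestamp prediction heads. The following is a minimal PyTorch sketch of that idea only; it is not the authors' implementation, and the GRU cells, layer sizes, and the simple learned attention are illustrative assumptions.

    # Minimal sketch of the twin-RNN idea (assumed details, not the paper's code).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class TwinRNNPointProcess(nn.Module):
        def __init__(self, num_event_types, ts_dim, hidden=32):
            super().__init__()
            self.event_emb = nn.Embedding(num_event_types, hidden)
            # RNN over (event embedding, inter-event gap) pairs -- the asynchronous stream
            self.event_rnn = nn.GRU(hidden + 1, hidden, batch_first=True)
            # RNN over the evenly sampled time series
            self.series_rnn = nn.GRU(ts_dim, hidden, batch_first=True)
            # attention scores over past event hidden states (for interpretability)
            self.attn = nn.Linear(hidden, 1)
            # prediction heads: next event type (logits) and next inter-event time (positive)
            self.type_head = nn.Linear(2 * hidden, num_event_types)
            self.time_head = nn.Linear(2 * hidden, 1)

        def forward(self, event_types, event_gaps, series):
            # event_types: (B, Le) int64; event_gaps: (B, Le, 1); series: (B, Ls, ts_dim)
            ev_in = torch.cat([self.event_emb(event_types), event_gaps], dim=-1)
            ev_h, _ = self.event_rnn(ev_in)                  # (B, Le, H)
            _, ts_h = self.series_rnn(series)                # (1, B, H), final state
            # attention-weighted summary of the event history
            weights = torch.softmax(self.attn(ev_h), dim=1)  # (B, Le, 1)
            ev_ctx = (weights * ev_h).sum(dim=1)             # (B, H)
            # joint representation modulates both prediction heads
            joint = torch.cat([ev_ctx, ts_h.squeeze(0)], dim=-1)
            return self.type_head(joint), F.softplus(self.time_head(joint))

In this sketch the two hidden states are simply concatenated before the output layers; the attention weights over the event history provide the kind of interpretability the abstract mentions, while the softplus keeps the predicted inter-event time positive.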

Authors (6)
  1. Shuai Xiao (31 papers)
  2. Junchi Yan (241 papers)
  3. Mehrdad Farajtabar (56 papers)
  4. Le Song (140 papers)
  5. Xiaokang Yang (210 papers)
  6. Hongyuan Zha (136 papers)
Citations (44)
