Improved Batching Strategy For Irregular Time-Series ODE (2207.05708v1)

Published 12 Jul 2022 in cs.LG

Abstract: Irregular time-series data are prevalent in the real world and are challenging to model with a simple recurrent neural network (RNN). Hence, a model combining ordinary differential equations (ODEs) with an RNN, called ODE-RNN, was proposed to model irregular time series with higher accuracy, but it suffers from high computational cost. In this paper, we improve the runtime of ODE-RNNs by using a more efficient batching strategy. Our experiments show that the new models reduce the runtime of ODE-RNN significantly, from 2 times up to 49 times depending on the irregularity of the data, while maintaining comparable accuracy. Hence, our model scales favorably for modeling larger irregular data sets.
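
The abstract describes ODE-RNN's two-phase update (evolving the hidden state through each irregular time gap with an ODE solver, then applying an RNN update at the observation) and a batching strategy that reduces the cost of those solver calls. As a rough illustration only, here is a minimal PyTorch sketch: the class `ODERNNCell` and function `bucket_by_gap` are hypothetical names, and the fixed-step Euler solver and quantile-based gap bucketing are stand-ins, not the authors' actual method.

```python
import torch
import torch.nn as nn

class ODERNNCell(nn.Module):
    """Hypothetical ODE-RNN cell: evolve the hidden state across the gap
    between observations with a fixed-step Euler solver, then apply a GRU
    update at the observation. Names and solver choice are assumptions,
    not the paper's implementation."""

    def __init__(self, input_dim, hidden_dim, ode_steps=4):
        super().__init__()
        self.ode_func = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.rnn_cell = nn.GRUCell(input_dim, hidden_dim)
        self.ode_steps = ode_steps

    def forward(self, h, x, dt):
        # dt: (batch,) time gaps since the previous observation
        step = (dt / self.ode_steps).unsqueeze(-1)
        for _ in range(self.ode_steps):
            h = h + step * self.ode_func(h)   # continuous dynamics over the gap
        return self.rnn_cell(x, h)            # discrete update at the observation


def bucket_by_gap(gaps, num_buckets=8):
    """Group observation indices with similar time gaps so each bucket can be
    pushed through the ODE solver in one batched call; this illustrates
    gap-aware batching in general, not the paper's exact scheme."""
    gaps = torch.as_tensor(gaps, dtype=torch.float32)
    edges = torch.quantile(gaps, torch.linspace(0, 1, num_buckets + 1))
    bucket_ids = torch.bucketize(gaps, edges[1:-1])
    return [torch.nonzero(bucket_ids == b).flatten()
            for b in range(num_buckets) if (bucket_ids == b).any()]


# Example usage (random data, shapes only):
cell = ODERNNCell(input_dim=3, hidden_dim=16)
h = torch.zeros(5, 16)
x = torch.randn(5, 3)
dt = torch.rand(5)
h_next = cell(h, x, dt)                    # (5, 16)
buckets = bucket_by_gap(torch.rand(100))   # list of index tensors
```

The intuition behind any such batching scheme is that the ODE solve between observations dominates the cost, so grouping samples whose gaps (and hence solver workloads) are similar avoids redundant solver steps; the paper's reported 2x to 49x speedups depend on how irregular the data is.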

Authors (4)
  1. Ting Fung Lam (1 paper)
  2. Yony Bresler (3 papers)
  3. Ahmed Khorshid (1 paper)
  4. Nathan Perlmutter (11 papers)
