
InceptionTime: Finding AlexNet for Time Series Classification (1909.04939v3)

Published 11 Sep 2019 in cs.LG and stat.ML

Abstract: This paper brings deep learning at the forefront of research into Time Series Classification (TSC). TSC is the area of machine learning tasked with the categorization (or labelling) of time series. The last few decades of work in this area have led to significant progress in the accuracy of classifiers, with the state of the art now represented by the HIVE-COTE algorithm. While extremely accurate, HIVE-COTE cannot be applied to many real-world datasets because of its high training time complexity in O(N² · T⁴) for a dataset with N time series of length T. For example, it takes HIVE-COTE more than 8 days to learn from a small dataset with N = 1500 time series of short length T = 46. Meanwhile deep learning has received enormous attention because of its high accuracy and scalability. Recent approaches to deep learning for TSC have been scalable, but less accurate than HIVE-COTE. We introduce InceptionTime - an ensemble of deep Convolutional Neural Network (CNN) models, inspired by the Inception-v4 architecture. Our experiments show that InceptionTime is on par with HIVE-COTE in terms of accuracy while being much more scalable: not only can it learn from 1,500 time series in one hour but it can also learn from 8M time series in 13 hours, a quantity of data that is fully out of reach of HIVE-COTE.

Citations (928)

Summary

  • The paper introduces an ensemble of Inception-based CNN models that match HIVE-COTE’s accuracy while drastically reducing training time.
  • The paper demonstrates that InceptionTime scales efficiently, training on 1,500 time series in one hour and on 8 million series in 13 hours.
  • The paper provides an in-depth analysis of hyperparameters and architectural choices, highlighting bottlenecks and residual connections for improved convergence.

InceptionTime: A Novel Deep Learning Approach for Time Series Classification

“InceptionTime: Finding AlexNet for Time Series Classification” presents a novel deep learning framework developed to address the challenges in Time Series Classification (TSC). The current state-of-the-art classifier, HIVE-COTE, demonstrates significant accuracy but suffers from high computational complexity, limiting its applicability to larger datasets. This paper introduces InceptionTime, an ensemble of deep Convolutional Neural Network (CNN) models inspired by the Inception-v4 architecture, which aims to match HIVE-COTE's accuracy while ensuring scalability.

The InceptionTime model is extensively evaluated on the UCR archive, a well-established benchmark for TSC, and demonstrates parity with HIVE-COTE in terms of accuracy. More notably, InceptionTime provides a scalable solution, significantly reducing training time. For instance, InceptionTime can train on 1,500 time series within one hour, whereas HIVE-COTE requires more than eight days on a dataset of that size. Additionally, InceptionTime can train on 8 million time series in 13 hours, a scale far beyond HIVE-COTE's reach, highlighting its practical advantages for large-scale applications.

Key Contributions and Findings

  1. Model Architecture: The InceptionTime model adopts an Inception-based architecture tailored for TSC. Each Inception module applies parallel convolutions with filters of varying lengths, allowing the network to capture both short- and long-duration patterns within a series. The architecture includes bottleneck layers to reduce channel dimensionality (and thus parameter count) and residual connections to facilitate gradient flow.
  2. Empirical Evaluation: The model achieves state-of-the-art accuracy on the UCR archive, comparable to HIVE-COTE, while being orders of magnitude faster. On datasets such as InlineSkate and a Satellite Image Time Series (SITS), InceptionTime demonstrated significantly reduced training times compared to HIVE-COTE.
  3. Architectural Insights: Experiments showed that increasing Receptive Field (RF), depth, and filter length positively affects model performance, especially for longer time series. However, longer filters can overfit smaller datasets.
  4. Impact of Hyperparameters: The paper provides comprehensive insights into the effects of varying several architectural hyperparameters, such as batch size, the bottleneck layer, and residual connections. The experiments revealed that residual connections, though not significantly altering accuracy, could aid training convergence. Longer filters improved performance on datasets with long underlying patterns, underscoring the balance between capturing temporal dependencies and avoiding overfitting.
  5. Ensembling Strategy: To enhance robustness, InceptionTime ensembles five Inception networks trained with different random weight initializations, averaging their class-probability outputs. This mitigates the high variance in accuracy observed across individual networks; detailed analysis shows no significant accuracy improvement beyond five models.
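To make the architecture and ensembling points above concrete, here is a minimal NumPy sketch of the forward pass of one Inception module. The layer sizes (bottleneck of 32 channels, filter lengths 10/20/40, 32 filters per branch) follow the paper's defaults, but the helper names (`conv1d_same`, `inception_module`), the random weights, and the dummy ensemble probabilities are illustrative only; batch normalization and training are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_same(x, w):
    """'Same'-padded 1-D convolution. x: (T, C_in), w: (k, C_in, C_out)."""
    k, c_in, c_out = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, k - 1 - pad), (0, 0)))
    T = x.shape[0]
    out = np.empty((T, c_out))
    for t in range(T):
        # correlate the k-long window with every output filter at once
        out[t] = np.tensordot(xp[t:t + k], w, axes=([0, 1], [0, 1]))
    return out

def inception_module(x, n_filters=32, kernel_sizes=(10, 20, 40), bottleneck=32):
    """One Inception module: bottleneck, parallel convolutions of several
    lengths, and a max-pool branch, concatenated along the channel axis."""
    c_in = x.shape[1]
    # 1) bottleneck: a length-1 convolution reducing channel dimensionality
    w_b = rng.normal(0, 0.1, (1, c_in, bottleneck))
    z = conv1d_same(x, w_b)
    # 2) parallel convolutions with filters of varying length
    branches = []
    for k in kernel_sizes:
        w = rng.normal(0, 0.1, (k, bottleneck, n_filters))
        branches.append(conv1d_same(z, w))
    # 3) max-pool branch (window 3, stride 1) followed by a length-1 conv
    xp = np.pad(x, ((1, 1), (0, 0)), constant_values=-np.inf)
    pooled = np.stack([xp[t:t + 3].max(axis=0) for t in range(x.shape[0])])
    w_p = rng.normal(0, 0.1, (1, c_in, n_filters))
    branches.append(conv1d_same(pooled, w_p))
    # 4) concatenate branch outputs channel-wise, then apply ReLU
    return np.maximum(np.concatenate(branches, axis=1), 0.0)

x = rng.normal(size=(128, 1))   # one univariate series of length 128
y = inception_module(x)
print(y.shape)                  # time length preserved; 4 branches * 32 filters

# Ensembling sketch: average class probabilities from 5 independently
# initialized networks (dummy probabilities stand in for trained models).
net_probs = np.stack([rng.dirichlet(np.ones(3)) for _ in range(5)])
ensemble_probs = net_probs.mean(axis=0)
prediction = int(np.argmax(ensemble_probs))
```

Note how the output keeps the input's time length while the channel count grows to 4 × 32 = 128; stacking several such modules (with residual connections every third module in the paper) yields the full network.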

Implications and Future Directions

The findings have practical implications in domains producing massive time series datasets, such as healthcare, finance, and remote sensing. The ability to train on millions of time series makes InceptionTime a suitable candidate for real-world applications where scalability is crucial.

Future research directions might include:

  • Multivariate Time Series: Extending InceptionTime to handle multivariate time series, providing further utility across varied domains.
  • Transfer Learning: Leveraging pre-trained models in a transfer learning setup to improve performance on specific, smaller datasets.
  • Further Model Optimizations: Exploring other recent advancements in CNN architectures from the computer vision field to enhance the performance and scalability of InceptionTime.

In conclusion, InceptionTime establishes a new benchmark in TSC by combining accuracy with scalability. This model sets a precedent for further innovations in deep learning applications for time series data, fostering advancements that bridge the gap between theoretical development and practical deployment.