Learnable Dynamic Temporal Pooling for Time Series Classification (2104.02577v1)

Published 2 Apr 2021 in cs.LG and cs.AI

Abstract: With the increase of available time series data, predicting their class labels has become one of the most important challenges in a wide range of disciplines. Recent studies on time series classification show that convolutional neural networks (CNNs) achieve state-of-the-art performance as a single classifier. In this work, pointing out that the global pooling layer usually adopted by existing CNN classifiers discards the temporal information of high-level features, we present a dynamic temporal pooling (DTP) technique that reduces the temporal size of hidden representations by aggregating features at the segment level. To partition a whole series into multiple segments, we utilize dynamic time warping (DTW) to align each time point, in temporal order, with the prototypical features of the segments, which can be optimized simultaneously with the network parameters of CNN classifiers. The DTP layer combined with a fully-connected layer helps extract further discriminative features that reflect their temporal position within the input time series. Extensive experiments on both univariate and multivariate time series datasets show that the proposed pooling significantly improves classification performance.
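The abstract describes the mechanism only at a high level. The sketch below is a minimal illustration of the general idea of segment-level pooling between a 1-D CNN backbone and a fully-connected classifier: each time step is softly assigned to one of a few learnable segment prototypes and the features are averaged per segment. It is an assumption-laden simplification, not the authors' implementation; in particular, the paper's DTW-based alignment (which enforces temporal order) is replaced here by a plain softmax assignment, and names such as `DynamicTemporalPooling` and `n_segments` are invented for illustration.

```python
# Illustrative sketch only: soft segment-level pooling in the spirit of DTP.
# The paper uses DTW to keep the time-step-to-segment alignment temporally
# ordered; this sketch uses an unordered softmax assignment for brevity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DynamicTemporalPooling(nn.Module):
    """Pools a (batch, channels, time) feature map into (batch, channels, n_segments)
    by softly assigning each time step to a learnable segment prototype."""

    def __init__(self, n_channels: int, n_segments: int):
        super().__init__()
        # Prototypical feature per segment, optimized jointly with the CNN.
        self.prototypes = nn.Parameter(torch.randn(n_segments, n_channels))

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, channels, time) hidden representation from the CNN backbone
        sim = torch.einsum("bct,kc->btk", h, self.prototypes)   # time-vs-segment similarity
        assign = F.softmax(sim, dim=-1)                          # soft segment assignment
        pooled = torch.einsum("bct,btk->bck", h, assign)         # weighted sum per segment
        pooled = pooled / assign.sum(dim=1).unsqueeze(1).clamp_min(1e-8)
        return pooled                                            # (batch, channels, n_segments)


if __name__ == "__main__":
    x = torch.randn(8, 64, 128)                   # CNN features: (batch, channels, time)
    dtp = DynamicTemporalPooling(n_channels=64, n_segments=4)
    z = dtp(x)                                     # (8, 64, 4)
    logits = nn.Linear(64 * 4, 10)(z.flatten(1))   # segment-aware fully-connected head
    print(logits.shape)                            # torch.Size([8, 10])
```

Because the pooled output keeps one slot per segment, the fully-connected head can weight features differently depending on where in the series they occur, which is the property the abstract attributes to combining the DTP layer with a fully-connected layer.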

Citations (26)
