
Enabling Tensor Decomposition for Time-Series Classification via A Simple Pseudo-Laplacian Contrast (2409.15200v1)

Published 23 Sep 2024 in cs.LG

Abstract: Tensor decomposition has emerged as a prominent technique for learning low-dimensional representations under the supervision of reconstruction error, primarily benefiting data inference tasks such as completion and imputation, but not classification tasks. We argue that the non-uniqueness and rotation invariance of tensor decomposition allow us to identify the directions with the largest class-variability, and that a simple graph Laplacian can effectively achieve this objective. We therefore propose a novel Pseudo Laplacian Contrast (PLC) tensor decomposition framework, which integrates data augmentation and a cross-view Laplacian to enable the extraction of class-aware representations while effectively capturing the intrinsic low-rank structure under the reconstruction constraint. An unsupervised alternating optimization algorithm is further developed to iteratively estimate the pseudo graph and minimize the loss using Alternating Least Squares (ALS). Extensive experimental results on various datasets demonstrate the effectiveness of our approach.
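
The abstract describes coupling a reconstruction objective (tensor decomposition) with a graph-Laplacian term on the sample mode, optimized by alternating between pseudo-graph estimation and ALS updates. Below is a minimal sketch of that general recipe, not the authors' PLC algorithm: a CP decomposition of a (samples × time × channels) tensor solved by ALS, where the sample-mode update carries an extra Laplacian penalty and the pseudo graph is stood in for by a simple k-NN cosine affinity on the current sample embeddings. The function names (`laplacian_cp_als`, `pseudo_laplacian`), the penalty weight `lam`, and the k-NN construction are illustrative assumptions, not details from the paper.

```python
# Hedged sketch: CP decomposition via ALS with a graph-Laplacian penalty on the
# sample-mode factor. This is NOT the paper's PLC algorithm; the pseudo-graph
# construction (k-NN cosine affinity) and the penalty weight `lam` are
# placeholders for the cross-view Laplacian described in the abstract.
import numpy as np
from scipy.linalg import solve_sylvester

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product of B (T x R) and C (D x R) -> (T*D x R)."""
    T, R = B.shape
    D, _ = C.shape
    return (B[:, None, :] * C[None, :, :]).reshape(T * D, R)

def pseudo_laplacian(A, k=5):
    """Placeholder pseudo-graph: k-NN cosine affinity on current sample embeddings."""
    An = A / (np.linalg.norm(A, axis=1, keepdims=True) + 1e-12)
    S = An @ An.T
    np.fill_diagonal(S, -np.inf)                # exclude self-similarity
    W = np.zeros_like(S)
    idx = np.argsort(-S, axis=1)[:, :k]
    rows = np.repeat(np.arange(A.shape[0]), k)
    W[rows, idx.ravel()] = 1.0
    W = np.maximum(W, W.T)                      # symmetrize
    return np.diag(W.sum(axis=1)) - W           # unnormalized Laplacian L = D - W

def laplacian_cp_als(X, rank, lam=0.1, n_iter=50, seed=0):
    """ALS for X ~ [[A, B, C]] with lam * tr(A^T L A) added on the sample mode."""
    N, T, D = X.shape
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((N, rank))
    B = rng.standard_normal((T, rank))
    C = rng.standard_normal((D, rank))
    X0 = X.reshape(N, T * D)                     # mode-0 unfolding
    X1 = X.transpose(1, 0, 2).reshape(T, N * D)  # mode-1 unfolding
    X2 = X.transpose(2, 0, 1).reshape(D, N * T)  # mode-2 unfolding
    for _ in range(n_iter):
        L = pseudo_laplacian(A)                  # re-estimate pseudo graph each sweep
        # A-update solves lam*L A + A (B^T B * C^T C) = X0 (B ⊙ C), a Sylvester equation
        KR = khatri_rao(B, C)
        A = solve_sylvester(lam * L, (B.T @ B) * (C.T @ C), X0 @ KR)
        # Plain least-squares ALS updates for the remaining modes
        KR = khatri_rao(A, C)
        B = X1 @ KR @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        KR = khatri_rao(A, B)
        C = X2 @ KR @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Toy usage: 100 series, 64 time steps, 3 channels
X = np.random.default_rng(1).standard_normal((100, 64, 3))
A, B, C = laplacian_cp_als(X, rank=8)
print(A.shape)  # (100, 8) sample embeddings for downstream classification
```

The Laplacian penalty turns the sample-mode update into a Sylvester equation (lam·L·A + A·G = X₀·KR with G the Hadamard product of the factor Grams), hence the use of `scipy.linalg.solve_sylvester`; the other modes keep the standard ALS least-squares form.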

Citations (1)
