
DMSC: Dynamic Multi-Scale Coordination Framework for Time Series Forecasting (2508.02753v2)

Published 3 Aug 2025 in cs.LG and cs.AI

Abstract: Time Series Forecasting (TSF) faces persistent challenges in modeling intricate temporal dependencies across different scales. Despite recent advances leveraging different decomposition operations and novel architectures based on CNN, MLP or Transformer, existing methods still struggle with static decomposition strategies, fragmented dependency modeling, and inflexible fusion mechanisms, limiting their ability to model intricate temporal dependencies. To explicitly address these three problems, we propose a novel Dynamic Multi-Scale Coordination Framework (DMSC) with an Embedded Multi-Scale Patch Decomposition (EMPD) block, a Triad Interaction Block (TIB) and an Adaptive Scale Routing MoE (ASR-MoE) block. Specifically, EMPD is designed as a built-in component to dynamically segment sequences into hierarchical patches with exponentially scaled granularities, eliminating predefined scale constraints through input-adaptive patch adjustment. TIB then jointly models intra-patch, inter-patch, and cross-variable dependencies within each layer's decomposed representations. EMPD and TIB are jointly integrated into layers forming a multi-layer progressive cascade architecture, where coarse-grained representations from earlier layers adaptively guide fine-grained feature extraction in subsequent layers via gated pathways. Finally, ASR-MoE dynamically fuses multi-scale predictions by leveraging specialized global and local experts with temporal-aware weighting. Comprehensive experiments on thirteen real-world benchmarks demonstrate that DMSC consistently maintains state-of-the-art (SOTA) performance and superior computational efficiency for TSF tasks. Code is available at https://github.com/1327679995/DMSC.

Summary

  • The paper presents DMSC, a novel framework that dynamically segments time series data and fuses predictions using multi-scale coordination.
  • It employs a lightweight EMPD block for adaptive patch decomposition and a Triad Interaction Block (TIB) to model intra-patch, inter-patch, and cross-variable dependencies.
  • Experimental results on 13 benchmarks demonstrate state-of-the-art performance, confirming the effectiveness of each framework component.

DMSC: Dynamic Multi-Scale Coordination Framework for Time Series Forecasting

Introduction

The "DMSC: Dynamic Multi-Scale Coordination Framework for Time Series Forecasting" presents a novel approach to enhancing the accuracy and efficiency of Time Series Forecasting (TSF) by tackling the limitations of existing models. The framework introduces three main components—Embedded Multi-Scale Patch Decomposition (EMPD), Triad Interaction Block (TIB), and Adaptive Scale Routing Mixture-of-Experts (ASR-MoE)—each addressing specific challenges in TSF by dynamically modeling multi-scale dependencies and optimizing prediction fusions.

Multi-Layer Progressive Cascade Architecture

The proposed architecture is designed to capture hierarchical features through the integration of EMPD and TIB into a cascade structure. This setup allows the dynamic adjustment of patch granularities and the joint modeling of dependencies (Figure 1).

Figure 1: Visualization of different experts on the ETTh1 dataset; the look-back and prediction lengths are set to 96.

  • EMPD dynamically segments time series inputs into hierarchical patches, using a lightweight neural controller to adjust patch lengths based on input characteristics. This input-adaptive segmentation improves feature extraction at multiple granularity levels compared with static fixed-length patching.
  • TIB models intra-patch, inter-patch, and cross-variable dependencies and integrates these heterogeneous dependencies through gated feature fusion, sharpening the model's representation of temporal patterns. A simplified sketch of one EMPD + TIB layer follows this list.
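
To make the layer mechanics concrete, here is a minimal PyTorch sketch of one cascade layer that pairs an EMPD-style adaptive patch split with a TIB-style triad mixer. The internals (the statistics-based length controller, the per-patch mean/std embedding, the GRU used for inter-patch mixing, and the mean-pooled cross-variable context) are illustrative assumptions rather than the authors' implementation; the released code at https://github.com/1327679995/DMSC is the authoritative reference.

```python
import torch
import torch.nn as nn


class EMPDSketch(nn.Module):
    """Adaptive patch decomposition (sketch): patch length is exponentially
    scaled with layer depth and nudged by a lightweight controller that reads
    simple statistics of the input window."""

    def __init__(self, base_patch=16, layer_idx=0, d_model=64):
        super().__init__()
        self.nominal = max(base_patch // (2 ** layer_idx), 2)  # coarser in earlier layers
        self.controller = nn.LazyLinear(1)   # input-adaptive length adjustment
        self.proj = nn.Linear(2, d_model)    # per-patch (mean, std) -> embedding

    def forward(self, x):                    # x: [B, C, L]
        stats = torch.cat([x.mean(-1), x.std(-1)], dim=-1)        # [B, 2C]
        shift = torch.tanh(self.controller(stats)).mean()         # scalar in (-1, 1)
        # Discrete adjustment of the nominal patch length (illustrative only).
        patch_len = int(max(2, round(self.nominal * (1.0 + 0.5 * float(shift)))))
        patch_len = min(patch_len, x.size(-1))
        patches = x.unfold(2, patch_len, patch_len)               # [B, C, N, patch_len]
        feats = torch.stack([patches.mean(-1), patches.std(-1)], dim=-1)
        return self.proj(feats)                                   # [B, C, N, d_model]


class TIBSketch(nn.Module):
    """Triad interaction (sketch): within-patch, across-patch and cross-variable
    mixing, combined by a learned gate."""

    def __init__(self, d_model=64):
        super().__init__()
        self.intra = nn.Linear(d_model, d_model)                  # within-patch features
        self.inter = nn.GRU(d_model, d_model, batch_first=True)   # across patches
        self.cross = nn.Linear(d_model, d_model)                  # cross-variable context
        self.gate = nn.Linear(3 * d_model, 3)                     # gated feature fusion

    def forward(self, z):                    # z: [B, C, N, D]
        B, C, N, D = z.shape
        h_intra = torch.relu(self.intra(z))
        h_inter, _ = self.inter(z.reshape(B * C, N, D))
        h_inter = h_inter.reshape(B, C, N, D)
        h_cross = torch.relu(self.cross(z.mean(dim=1, keepdim=True))).expand(B, C, N, D)
        w = torch.softmax(self.gate(torch.cat([h_intra, h_inter, h_cross], dim=-1)), dim=-1)
        return w[..., 0:1] * h_intra + w[..., 1:2] * h_inter + w[..., 2:3] * h_cross


# Tiny smoke test with assumed shapes: 8 samples, 7 variables, 96 time steps.
z = EMPDSketch(base_patch=16, layer_idx=0, d_model=64)(torch.randn(8, 7, 96))
out = TIBSketch(d_model=64)(z)
print(out.shape)   # [8, 7, n_patches, 64]
```

In the full framework, several such layers are cascaded with progressively finer patch granularities, and gated pathways let the coarse-grained representations of earlier layers guide feature extraction in later ones.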

Adaptive Scale Routing Mixture-of-Experts (ASR-MoE)

ASR-MoE addresses the fusion of multi-scale predictions by utilizing specialized experts:

  • Global Experts capture long-term trends, while Local Experts focus on short-term variations. The dynamic routing mechanism weights the scale-specific predictions based on current temporal patterns observed in the data.
  • Temporal-Aware Weighting allows ASR-MoE to prioritize different scales and experts, ensuring that both short-term fluctuations and long-term trends are incorporated into the final predictions. A simplified sketch of this routing follows this list.
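
As a rough illustration of this routing idea, the sketch below assumes linear global experts that read the entire look-back window, linear local experts that read only the most recent steps, and a dense softmax router conditioned on the input window; the expert forms, the recent-window length, and the router design are assumptions for exposition, not the paper's architecture.

```python
import torch
import torch.nn as nn


class ASRMoESketch(nn.Module):
    """Sketch of adaptive scale routing: global experts read the whole window,
    local experts read only recent steps, and a router weights their forecasts."""

    def __init__(self, in_len=96, out_len=96, n_global=2, n_local=2,
                 recent=24, hidden=64):
        super().__init__()
        self.recent = recent                                    # local-expert window (assumed)
        self.global_experts = nn.ModuleList(
            [nn.Linear(in_len, out_len) for _ in range(n_global)])  # long-term structure
        self.local_experts = nn.ModuleList(
            [nn.Linear(recent, out_len) for _ in range(n_local)])   # short-term structure
        self.router = nn.Sequential(                            # temporal-aware weighting
            nn.Linear(in_len, hidden), nn.ReLU(),
            nn.Linear(hidden, n_global + n_local))

    def forward(self, x):                                       # x: [B, C, in_len]
        B, C, L = x.shape
        flat = x.reshape(B * C, L)
        preds = [e(flat) for e in self.global_experts]              # long-term views
        preds += [e(flat[:, -self.recent:]) for e in self.local_experts]  # short-term views
        preds = torch.stack(preds, dim=1)                       # [B*C, E, out_len]
        weights = torch.softmax(self.router(flat), dim=-1)      # [B*C, E]
        fused = (weights.unsqueeze(-1) * preds).sum(dim=1)      # weighted fusion
        return fused.reshape(B, C, -1)


# Tiny smoke test with assumed shapes: 8 samples, 7 variables, 96-step window.
y = ASRMoESketch()(torch.randn(8, 7, 96))
print(y.shape)    # torch.Size([8, 7, 96])
```

A sparse top-k gate or explicit calendar/time features could replace the dense router here; the sketch only shows how scale-specialized forecasts are combined with input-dependent weights.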

Experimental Results

Comprehensive experiments on 13 real-world TSF benchmarks demonstrate the superiority of DMSC. The model maintains state-of-the-art (SOTA) performance across various datasets, outperforming existing approaches such as Transformer-based and CNN-based models (Figure 2).

Figure 2: Forecasting results with varying look-back lengths on the Electricity dataset. Look-back lengths are set to {48, 96, 192, 336, 720}.

  • DMSC consistently outperforms baselines such as TimeMixer and PatchTST, particularly on datasets with complex temporal dependencies. The model exhibits robustness across both short-term and long-term forecasting tasks.

Ablation Studies and Model Analysis

Detailed ablation studies validate the importance of each component of the DMSC framework. The results show that:

  • Removing EMPD, TIB, or ASR-MoE significantly degrades performance, confirming their essential roles in enhancing the model’s predictive capability.
  • EMPD’s dynamic decomposition strategy and TIB’s triad interaction modeling are crucial for capturing multi-scale dependencies and refining predictions (Figure 3).

Figure 3: Performance on the Electricity dataset.

Conclusion

The DMSC framework offers a robust solution for TSF by effectively modeling multi-scale dependencies and optimizing prediction fusion through dynamic coordination. Its modular design is scalable and adaptable, making it suitable for various forecasting scenarios. Future research will focus on extending DMSC to multi-task learning environments and optimizing its applicability across diverse real-world datasets.
