Joint Learning of Panel VAR models with Low Rank and Sparse Structure (2509.15402v1)

Published 18 Sep 2025 in stat.ME

Abstract: Panel vector auto-regressive (VAR) models are widely used to capture the dynamics of multivariate time series across different subpopulations, where each subpopulation shares a common set of variables. In this work, we propose a panel VAR model with a shared low-rank structure, modulated by subpopulation-specific weights, and complemented by idiosyncratic sparse components. To ensure parameter identifiability, we impose structural constraints that lead to a nonsmooth, nonconvex optimization problem. We develop a multi-block Alternating Direction Method of Multipliers (ADMM) algorithm for parameter estimation and establish its convergence under mild regularity conditions. Furthermore, we derive consistency guarantees for the proposed estimators under high-dimensional scaling. The effectiveness of the proposed modeling framework and estimators is demonstrated through experiments on both synthetic data and a real-world neuroscience data set.
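The abstract does not spell out the model equations, but one natural reading of "a shared low-rank structure, modulated by subpopulation-specific weights, and complemented by idiosyncratic sparse components" is a VAR(1) of the form y_t^(k) = A^(k) y_{t-1}^(k) + e_t^(k) with A^(k) = w_k L + S^(k), where L is low rank and shared across subpopulations k = 1, ..., K, w_k is a subpopulation-specific weight, and S^(k) is sparse. The sketch below simulates synthetic data under that assumed form; the lag order, weighting scheme, and identifiability constraints are assumptions for illustration, not the paper's actual formulation or estimation procedure.

```python
# Minimal synthetic-data sketch of an assumed low-rank + sparse panel VAR(1):
# A_k = w_k * L + S_k, with L shared (rank r) and S_k idiosyncratic and sparse.
# This is an illustrative reading of the abstract, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)
p, K, T, r = 10, 3, 200, 2   # variables, subpopulations, time points, shared rank

# Shared low-rank component L = U V^T, scaled so its spectral norm is modest.
U = rng.normal(size=(p, r))
V = rng.normal(size=(p, r))
L = 0.3 * (U @ V.T) / np.linalg.norm(U @ V.T, 2)

weights = rng.uniform(0.5, 1.5, size=K)   # subpopulation-specific weights w_k
series = []
for k in range(K):
    # Idiosyncratic sparse component: ~5% nonzero entries with small magnitude.
    S_k = np.where(rng.random((p, p)) < 0.05,
                   rng.normal(scale=0.1, size=(p, p)), 0.0)
    A_k = weights[k] * L + S_k            # low-rank + sparse transition matrix
    # Rescale if needed so the VAR(1) process is stable (spectral radius < 1).
    rho = max(abs(np.linalg.eigvals(A_k)))
    if rho >= 0.95:
        A_k *= 0.95 / rho
    y = np.zeros((T, p))
    for t in range(1, T):
        y[t] = A_k @ y[t - 1] + rng.normal(scale=0.1, size=p)
    series.append(y)

print([y.shape for y in series])          # K trajectories of shape (T, p)
```

Data of this shape (K subpopulations, each a T x p multivariate time series over a common variable set) is the input the paper's multi-block ADMM estimator would operate on; the estimator itself and its constraints are not reproduced here.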