
Regularized Estimation of the Loading Matrix in Factor Models for High-Dimensional Time Series (2506.11232v1)

Published 12 Jun 2025 in stat.ME

Abstract: High-dimensional data analysis using traditional models suffers from overparameterization. Two types of techniques are commonly used to reduce the number of parameters: regularization and dimension reduction. In this work, we combine them by imposing a sparse factor structure and propose a regularized estimator that further reduces the number of parameters in factor models. A challenge limiting the widespread application of factor models is that factors are hard to interpret, since both the factors and the loading matrix are unobserved. To address this, we introduce a penalty term when estimating the loading matrix to obtain a sparse estimate. As a result, each factor drives only the smaller subset of time series with which it is most strongly correlated, improving factor interpretability. The theoretical properties of the proposed estimator are investigated, and simulation results confirm that the algorithm performs well. We apply the method to Hawaii tourism data. The results indicate that two groups drive the number of domestic tourists in Hawaii: visitors from high-latitude states (factor 1) and visitors from inland or low-latitude states (factor 2). This reveals two main reasons people visit Hawaii: (1) to escape the cold and (2) to enjoy the beach and water activities.
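
To make the idea of a sparse loading matrix concrete, below is a minimal Python sketch. It is not the paper's estimator: it assumes a simple PCA estimate of the loadings followed by entrywise soft-thresholding as a stand-in for an L1-type penalty, and the function name sparse_loading_estimate and all tuning values are purely illustrative.

import numpy as np

def sparse_loading_estimate(X, r, lam):
    """Illustrative sketch (not the paper's algorithm): PCA loadings
    from the sample covariance, then soft-thresholding for sparsity."""
    T, p = X.shape
    Xc = X - X.mean(axis=0)                       # center each series
    S = Xc.T @ Xc / T                             # sample covariance (p x p)
    eigval, eigvec = np.linalg.eigh(S)            # eigenvalues in ascending order
    idx = np.argsort(eigval)[::-1][:r]            # top-r eigenpairs
    loadings = eigvec[:, idx] * np.sqrt(eigval[idx])
    # Soft-threshold each entry: a crude stand-in for a penalized estimate
    L_hat = np.sign(loadings) * np.maximum(np.abs(loadings) - lam, 0.0)
    # Recover factors by least squares given the sparse loadings
    F_hat = Xc @ L_hat @ np.linalg.pinv(L_hat.T @ L_hat)
    return L_hat, F_hat

# Usage on simulated data with a block-sparse true loading matrix
rng = np.random.default_rng(0)
T, p, r = 200, 30, 2
true_L = np.zeros((p, r))
true_L[:15, 0] = rng.normal(1.0, 0.2, 15)         # factor 1 loads on first 15 series
true_L[15:, 1] = rng.normal(1.0, 0.2, 15)         # factor 2 loads on last 15 series
F = rng.normal(size=(T, r))
X = F @ true_L.T + 0.3 * rng.normal(size=(T, p))
L_hat, F_hat = sparse_loading_estimate(X, r=2, lam=0.2)
print("nonzero loadings per factor:", (np.abs(L_hat) > 0).sum(axis=0))

In this toy setup each estimated factor ends up loading on roughly one block of series, which mirrors the interpretability argument in the abstract: sparsity ties each factor to the subset of series it drives most strongly.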


