Learning Low-Complexity Autoregressive Models via Proximal Alternating Minimization
Abstract: We consider the estimation of the state transition matrix in vector autoregressive models when time-series data are limited but non-sequential steady-state data are abundant. To leverage both sources of data, we formulate a least-squares minimization problem regularized by a Lyapunov penalty. We impose cardinality or rank constraints to reduce the complexity of the autoregressive model. We solve the resulting nonconvex, nonsmooth problem using the proximal alternating linearized minimization (PALM) method. We show that PALM converges globally to a critical point and that the estimation error decreases monotonically. Furthermore, we obtain explicit formulas for the proximal operators to facilitate the implementation of PALM. We demonstrate the effectiveness of the developed method on synthetic and real-world data. Our experiments show that PALM outperforms the gradient projection method in both computational efficiency and solution quality.
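To make the algorithmic idea concrete, the following is a minimal sketch of a PALM-style proximal gradient iteration for the cardinality-constrained special case: fitting a sparse transition matrix A to snapshots of x(t+1) ≈ A x(t) by least squares, where the proximal operator of the cardinality constraint is hard thresholding (keep the s largest-magnitude entries). This is an illustrative assumption, not the paper's full formulation; in particular, the Lyapunov penalty on steady-state data and the rank-constrained variant are omitted.

```python
import numpy as np

def hard_threshold(A, s):
    """Proximal operator of the constraint card(A) <= s:
    keep the s largest-magnitude entries, zero out the rest."""
    flat = np.abs(A).ravel()
    if s >= flat.size:
        return A
    thresh = np.partition(flat, -s)[-s]  # s-th largest magnitude
    return np.where(np.abs(A) >= thresh, A, 0.0)

def palm_sparse_ar(X, s, iters=500):
    """Estimate a sparse transition matrix A from a state trajectory.

    X : (n, T) array of states; fits X[:, 1:] ~ A @ X[:, :-1]
        subject to card(A) <= s via proximal gradient steps.
    """
    Xp, Xm = X[:, 1:], X[:, :-1]
    n = X.shape[0]
    A = np.zeros((n, n))
    # Lipschitz constant of the gradient of 0.5 * ||A Xm - Xp||_F^2
    L = np.linalg.norm(Xm @ Xm.T, 2)
    step = 1.0 / L
    for _ in range(iters):
        grad = (A @ Xm - Xp) @ Xm.T      # gradient of the smooth data-fit term
        A = hard_threshold(A - step * grad, s)  # proximal (projection) step
    return A
```

A usage sketch: simulate a stable AR(1) process with a known sparse transition matrix and recover it from the trajectory. With a fixed step size 1/L, each iteration is a descent step for the smooth term followed by the exact cardinality prox, mirroring the per-block updates in PALM.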