CP Factor Model for Dynamic Tensors (2110.15517v2)

Published 29 Oct 2021 in stat.ME and econ.EM

Abstract: Observations in various applications are frequently represented as a time series of multidimensional arrays, called tensor time series, preserving the inherent multidimensional structure. In this paper, we present a factor model approach, in a form similar to tensor CP decomposition, to the analysis of high-dimensional dynamic tensor time series. As the loading vectors are uniquely defined but not necessarily orthogonal, the model is significantly different from existing tensor factor models based on Tucker-type tensor decomposition. The model structure allows for a set of uncorrelated one-dimensional latent dynamic factor processes, making it much more convenient to study the underlying dynamics of the time series. A new high-order projection estimator is proposed for such a factor model, utilizing the special structure and the idea of the higher order orthogonal iteration procedures commonly used in Tucker-type tensor factor models and general tensor CP decomposition procedures. Theoretical investigation provides statistical error bounds for the proposed methods, showing the significant advantage of utilizing the special model structure. A simulation study is conducted to further demonstrate the finite sample properties of the estimators. A real data application is used to illustrate the model and its interpretations.
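
As a rough sketch of the model form the abstract describes (written here for an order-3 tensor observation for concreteness; the symbols are illustrative and not necessarily the paper's notation):

\[
\mathcal{X}_t \;=\; \sum_{k=1}^{r} d_k \, f_{k,t} \; a_{1k} \circ a_{2k} \circ a_{3k} \;+\; \mathcal{E}_t,
\qquad t = 1, \dots, T,
\]

where each $f_{k,t}$ is a one-dimensional latent factor process (uncorrelated across $k$), the $a_{mk}$ are unit-norm loading vectors that are uniquely defined but not required to be orthogonal, the $d_k$ are scale parameters, and $\mathcal{E}_t$ is a noise tensor. This CP-type form contrasts with Tucker-type tensor factor models, where the signal is driven by a lower-dimensional core tensor and orthonormal loading matrices.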
