Restricted Isometry Property of Rank-One Measurements with Random Unit-Modulus Vectors (2403.02654v2)

Published 5 Mar 2024 in cs.IT and math.IT

Abstract: The restricted isometry property (RIP) of a linear map is essential for guaranteeing the successful recovery of low-rank matrices. Existing works show that linear maps generated by measurement matrices with independent and identically distributed (i.i.d.) entries satisfy the RIP with high probability. However, for non-i.i.d. measurement matrices, such as rank-one measurements, RIP compliance is not guaranteed. In this paper, we show that the RIP can still be achieved with high probability when the rank-one measurement matrices are constructed from random unit-modulus vectors. Compared to existing works, we are the first to establish the RIP of the linear map in this non-i.i.d. scenario. As validated in the experiments, this linear map is memory-efficient, and it not only satisfies the RIP but also achieves low-rank matrix recovery performance similar to that of conventional i.i.d. measurement matrices.
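To make the construction concrete, the following is a minimal NumPy sketch of rank-one measurements built from random unit-modulus vectors, as described in the abstract. The dimensions, the choice of a Hermitian PSD ground-truth matrix, and the uniform-phase sampling are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 16, 2, 200  # matrix size, rank, number of measurements (illustrative)

# Ground-truth low-rank Hermitian PSD matrix X = U U^H (assumed test instance)
U = (rng.standard_normal((n, r)) + 1j * rng.standard_normal((n, r))) / np.sqrt(2)
X = U @ U.conj().T

# Random unit-modulus measurement vectors: entries e^{j*theta}, so |a_k[i]| = 1
theta = rng.uniform(0.0, 2.0 * np.pi, size=(m, n))
A = np.exp(1j * theta)  # row k is the measurement vector a_k

# Rank-one measurements y_k = a_k^H X a_k = <a_k a_k^H, X>
y = np.einsum("ki,ij,kj->k", A.conj(), X, A)

# For Hermitian X, each a_k^H X a_k is real up to floating-point error
print(np.max(np.abs(y.imag)))
```

Note the memory efficiency mentioned in the abstract: only the m unit-modulus vectors (m·n entries) need to be stored, rather than m dense n×n measurement matrices as in the generic i.i.d. setting.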

