
Efficient Federated Low Rank Matrix Completion (2405.06569v2)

Published 10 May 2024 in cs.LG and eess.SP

Abstract: In this work, we develop and analyze a Gradient Descent (GD) based solution, called Alternating GD and Minimization (AltGDmin), for efficiently solving the low rank matrix completion (LRMC) problem in a federated setting. LRMC involves recovering an $n \times q$ rank-$r$ matrix $X^\star$ from a subset of its entries when $r \ll \min(n,q)$. Our theoretical guarantees (iteration and sample complexity bounds) imply that AltGDmin is the most communication-efficient solution in a federated setting, is one of the fastest, and has the second-best sample complexity among all iterative LRMC solutions. In addition, we prove two important corollaries. (a) We provide a guarantee for AltGDmin applied to the noisy LRMC problem. (b) We show how our lemmas can be used to obtain an improved sample complexity guarantee for AltMin, the fastest centralized solution.
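The abstract describes AltGDmin as alternating between a minimization step and a GD step over the two factors of $X = UB$. The sketch below illustrates this idea under stated assumptions: it is not the paper's implementation, and the initialization, step-size scaling (`eta` divided by a spectral-norm heuristic), and QR re-orthonormalization are illustrative choices, not details taken from the paper.

```python
import numpy as np

def altgdmin(Y, mask, r, eta=0.5, n_iters=50):
    """Illustrative sketch of an AltGDmin-style iteration for LRMC.

    Y    : n x q matrix with observed entries (zeros elsewhere)
    mask : boolean n x q array marking observed positions
    r    : target rank
    """
    n, q = Y.shape
    # Spectral-style initialization: top-r left singular vectors of the
    # zero-filled observation matrix (a common choice, assumed here).
    U, _, _ = np.linalg.svd(Y, full_matrices=False)
    U = U[:, :r]
    B = np.zeros((r, q))
    for _ in range(n_iters):
        # Minimization step: each column of B solves a small least-squares
        # problem over that column's observed rows. These column-wise solves
        # are independent, which is what makes the step federation-friendly.
        for k in range(q):
            rows = mask[:, k]
            B[:, k], *_ = np.linalg.lstsq(U[rows], Y[rows, k], rcond=None)
        # GD step on U over the observed entries of the residual.
        residual = (U @ B - Y) * mask
        grad = residual @ B.T
        step = eta / np.linalg.norm(B, 2) ** 2  # heuristic curvature scaling
        U = U - step * grad
        # Re-orthonormalize U; QR preserves its column span.
        U, _ = np.linalg.qr(U)
    return U, B
```

In a federated deployment, each node would hold a subset of columns, run its local least-squares solves, and communicate only partial gradients for the shared factor `U`, which is the source of the communication efficiency the abstract claims.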


