A Privacy Preserving System for Movie Recommendations Using Federated Learning (2303.04689v4)

Published 7 Mar 2023 in cs.IR, cs.CR, and cs.LG

Abstract: Recommender systems have become ubiquitous in recent years. They solve the tyranny-of-choice problem faced by many users and are used by many online businesses to drive engagement and sales. Besides other criticisms, such as creating filter bubbles within social networks, recommender systems are often reproached for collecting considerable amounts of personal data. However, personal information is fundamentally required to personalize recommendations. A recent distributed learning scheme called federated learning makes it possible to learn from personal user data without collecting it centrally. Consequently, we present a recommender system for movie recommendations that provides privacy, and thus trustworthiness, on multiple levels: First and foremost, it is trained using federated learning and is thus, by its very nature, privacy-preserving, while still enabling users to benefit from global insights. Furthermore, a novel federated learning scheme, called FedQ, is employed, which not only addresses the problems of non-i.i.d. data and small local datasets, but also prevents input-data reconstruction attacks by aggregating client updates early. Finally, to reduce the communication overhead, compression is applied, shrinking the exchanged neural-network parametrizations to a fraction of their original size. We conjecture that this may also improve data privacy through its lossy quantization stage.
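
The abstract names three building blocks: federated training, the FedQ idea of aggregating client updates early (so the server never sees an individual client's update), and lossy compression of the exchanged parametrizations. The sketch below is a minimal, hypothetical illustration of how these pieces could fit together on a toy objective; it is not the authors' implementation. The helper names (local_update, quantize, federated_round), the group size, and the uniform quantizer are stand-ins, and the least-squares "model" merely takes the place of the paper's neural recommender.

```python
# Minimal sketch (not the paper's implementation) of early/grouped aggregation
# of client updates plus lossy quantization of the communicated update.
import numpy as np

rng = np.random.default_rng(0)

def local_update(global_weights: np.ndarray, local_data: np.ndarray,
                 lr: float = 0.1) -> np.ndarray:
    """One local training step on a toy least-squares objective."""
    # Gradient of 0.5 * ||w - mean(local_data)||^2, standing in for real training.
    grad = global_weights - local_data.mean(axis=0)
    return global_weights - lr * grad

def quantize(delta: np.ndarray, num_levels: int = 16) -> np.ndarray:
    """Uniform lossy quantization of a weight update (illustrative only)."""
    scale = np.abs(delta).max() or 1.0
    levels = np.round(delta / scale * (num_levels // 2))
    return levels / (num_levels // 2) * scale

def federated_round(global_weights, client_datasets, group_size=4):
    """One communication round with early (grouped) aggregation."""
    deltas = []
    clients = list(client_datasets)
    rng.shuffle(clients)
    for start in range(0, len(clients), group_size):
        group = clients[start:start + group_size]
        # Each client trains locally; only the *group mean* update is passed on,
        # which is the early-aggregation idea sketched in the abstract.
        group_deltas = [local_update(global_weights, d) - global_weights
                        for d in group]
        group_delta = np.mean(group_deltas, axis=0)
        deltas.append(quantize(group_delta))          # compress before sending
    return global_weights + np.mean(deltas, axis=0)   # server-side averaging step

if __name__ == "__main__":
    dim, num_clients = 8, 16
    # Non-i.i.d. toy data: each client's samples are centred on its own offset.
    client_datasets = [rng.normal(loc=i % 4, size=(32, dim))
                       for i in range(num_clients)]
    w = np.zeros(dim)
    for _ in range(20):
        w = federated_round(w, client_datasets)
    print("learned weights:", np.round(w, 2))
```

In this toy setup the quantizer trades accuracy for bandwidth, and the grouped averaging means no single client's raw update ever leaves its group, which is the intuition behind the claimed protection against input-data reconstruction attacks.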
