Preference and Concurrence Aware Bayesian Graph Neural Networks for Recommender Systems (2312.11486v2)

Published 30 Nov 2023 in cs.IR and cs.LG

Abstract: Graph-based collaborative filtering methods deliver strong performance in recommender systems because they capture high-order information between users and items. However, the graphs are constructed from observed user-item interactions, which in industrial scenarios may miss links or contain spurious positive interactions. The Bayesian Graph Neural Network framework addresses this issue with generative models for the interaction graphs. The critical problem is devising a family of graph generative models tailored to recommender systems. We propose an efficient generative model that jointly considers the preferences of users, the concurrence of items, and important graph-structure information. Experiments on four popular benchmark datasets demonstrate the effectiveness of our proposed graph generative methods for recommender systems.
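The abstract describes the general Bayesian GNN recipe: treat the observed interaction graph as uncertain, sample plausible graphs from a generative model, and average the recommender's predictions over those samples. Below is a minimal, hypothetical NumPy sketch of that averaging step only; the LightGCN-style propagation, the random interaction dropout used as a stand-in for the paper's preference- and concurrence-aware generative model, and all names (`lightgcn_scores`, `sample_interactions`, `drop_p`, `K`) are illustrative assumptions, not the authors' implementation, and the embeddings are untrained.

```python
# Hypothetical sketch: Bayesian model averaging for a graph-based recommender.
# Sample several plausible interaction graphs (here via simple edge dropout as a
# stand-in for a learned graph generative model), propagate embeddings on each
# with a LightGCN-style scheme, and average the resulting user-item scores.
import numpy as np

rng = np.random.default_rng(0)

def build_adj(R):
    """Bipartite user-item interaction matrix R -> symmetric adjacency matrix."""
    n_users, n_items = R.shape
    adj = np.zeros((n_users + n_items, n_users + n_items))
    adj[:n_users, n_users:] = R
    adj[n_users:, :n_users] = R.T
    return adj

def lightgcn_scores(adj, emb, n_users, layers=2):
    """LightGCN-style propagation: average embeddings over symmetrically normalized hops."""
    deg = adj.sum(axis=1) + 1e-8
    norm_adj = adj / np.sqrt(deg[:, None]) / np.sqrt(deg[None, :])   # D^-1/2 A D^-1/2
    out, h = emb.copy(), emb
    for _ in range(layers):
        h = norm_adj @ h
        out = out + h
    out /= (layers + 1)
    users, items = out[:n_users], out[n_users:]
    return users @ items.T          # (n_users, n_items) preference scores

def sample_interactions(R, drop_p=0.1):
    """Stand-in generative step: randomly drop observed interactions to mimic graph uncertainty."""
    return R * (rng.random(R.shape) > drop_p)

# Toy data: 4 users x 5 items with binary observed interactions, untrained random embeddings.
n_users, n_items, dim = 4, 5, 16
R = rng.integers(0, 2, size=(n_users, n_items)).astype(float)
emb = rng.normal(scale=0.1, size=(n_users + n_items, dim))

# Bayesian model averaging: average predictions over K graphs drawn from the generative model.
K = 8
scores = np.mean(
    [lightgcn_scores(build_adj(sample_interactions(R)), emb, n_users) for _ in range(K)],
    axis=0,
)
print(scores.round(3))
```

In a full system the sampled graphs would come from a learned generative model (the paper's model conditions on user preferences and item concurrence), and the embeddings would be trained per sampled graph or shared across samples; the averaging step itself stays the same.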

