Neighborhood-Enhanced Supervised Contrastive Learning for Collaborative Filtering (2402.11523v1)

Published 18 Feb 2024 in cs.IR and cs.AI

Abstract: While effective in recommendation tasks, collaborative filtering (CF) techniques face the challenge of data sparsity. Researchers have begun leveraging contrastive learning to introduce additional self-supervised signals to address this. However, this approach often unintentionally distances the target user/item from their collaborative neighbors, limiting its efficacy. In response, we propose a solution that treats the collaborative neighbors of the anchor node as positive samples within the final objective loss function. This paper focuses on developing two unique supervised contrastive loss functions that effectively combine supervision signals with contrastive loss. We analyze our proposed loss functions through the gradient lens, demonstrating that different positive samples simultaneously influence updating the anchor node's embeddings. These samples' impact depends on their similarities to the anchor node and the negative samples. Using the graph-based collaborative filtering model as our backbone and following the same data augmentation methods as the existing contrastive learning model SGL, we effectively enhance the performance of the recommendation model. Our proposed Neighborhood-Enhanced Supervised Contrastive Loss (NESCL) model substitutes the contrastive loss function in SGL with our novel loss function, showing marked performance improvement. On three real-world datasets, Yelp2018, Gowalla, and Amazon-Book, our model surpasses the original SGL by 10.09%, 7.09%, and 35.36% on NDCG@20, respectively.
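To make the loss family described above concrete, here is a minimal sketch of a SupCon-style objective in which each anchor's positive set may contain several samples, e.g. the anchor's augmented view together with its collaborative neighbors, following the general supervised contrastive formulation of Khosla et al. [12]. The function name, tensor shapes, and temperature value are illustrative assumptions, not the authors' NESCL implementation.

```python
# Minimal sketch of a supervised contrastive loss with multiple positives
# per anchor (augmented view + collaborative neighbors). Names and shapes
# are assumptions for illustration, not the authors' NESCL code.
import torch
import torch.nn.functional as F

def neighbor_supcon_loss(anchor_emb, cand_emb, pos_mask, temperature=0.2):
    """anchor_emb: (B, d) anchor user/item embeddings.
    cand_emb:   (N, d) candidate embeddings (other augmented views and
                collaborative neighbors; assumed not to include the anchors).
    pos_mask:   (B, N) bool/float mask, nonzero where a candidate is a
                positive for the anchor (its other view or a neighbor).
    """
    a = F.normalize(anchor_emb, dim=-1)
    c = F.normalize(cand_emb, dim=-1)
    logits = a @ c.t() / temperature                      # (B, N) similarities
    # Denominator runs over all candidates ("out" form of SupCon).
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos = pos_mask.float()
    # Average the log-probabilities over each anchor's positive set, so every
    # positive pulls on the anchor's embedding, weighted by its similarity
    # relative to the negatives.
    loss = -(log_prob * pos).sum(dim=1) / pos.sum(dim=1).clamp(min=1)
    return loss.mean()
```

In this form the gradient on the anchor receives one pull per positive, scaled by that positive's softmax weight against the remaining candidates, which matches the gradient-level behavior the abstract highlights.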

References (54)
  1. X. Ning and G. Karypis, “SLIM: Sparse Linear Methods for Top-N Recommender Systems,” in ICDM, Dec. 2011, pp. 497–506.
  2. M. Deshpande and G. Karypis, “Item-based top- N recommendation algorithms,” TOIS, vol. 22, no. 1, pp. 143–177, Jan. 2004.
  3. B. Sarwar, G. Karypis, J. Konstan, and J. Riedl, “Analysis of recommendation algorithms for e-commerce,” in EC, 2000, pp. 158–167.
  4. S. Rendle, C. Freudenthaler, Z. Gantner, and L. Schmidt-Thieme, “BPR: Bayesian Personalized Ranking from Implicit Feedback,” in UAI, 2009, pp. 452–461.
  5. X. He, K. Deng, X. Wang, Y. Li, Y. Zhang, and M. Wang, “LightGCN: Simplifying and Powering Graph Convolution Network for Recommendation,” in SIGIR, 2020, pp. 639–648.
  6. Y. Koren, “Factorization Meets the Neighborhood: A Multifaceted Collaborative Filtering Model,” in KDD, 2008.
  7. X. Wang, X. He, M. Wang, F. Feng, and T.-S. Chua, “Neural Graph Collaborative Filtering,” in SIGIR, Jul. 2019, pp. 165–174.
  8. L. Chen, L. Wu, R. Hong, K. Zhang, and M. Wang, “Revisiting Graph Based Collaborative Filtering: A Linear Residual Graph Convolutional Network Approach,” in AAAI, vol. 34, 2020, pp. 27–34.
  9. H. Li, Y. Wang, Z. Lyu, and J. Shi, “Multi-task learning for recommendation over heterogeneous information network,” TKDE, vol. 34, no. 2, pp. 789–802, 2022.
  10. L. Wu, X. He, X. Wang, K. Zhang, and M. Wang, “A Survey on Accuracy-oriented Neural Recommendation: From Collaborative Filtering to Information-rich Recommendation,” IEEE Transactions on Knowledge and Data Engineering, pp. 4425–4445, 2022.
  11. F. Feng, X. He, H. Zhang, and T.-S. Chua, “Cross-GCN: Enhancing Graph Convolutional Network with k-Order Feature Interactions,” IEEE Transactions on Knowledge and Data Engineering, vol. 35, no. 1, pp. 225–236, 2021.
  12. P. Khosla, P. Teterwak, C. Wang, A. Sarna, Y. Tian, P. Isola, A. Maschinot, C. Liu, and D. Krishnan, “Supervised Contrastive Learning,” in NeurIPS, 2020.
  13. T. Chen, S. Kornblith, M. Norouzi, and G. Hinton, “A Simple Framework for Contrastive Learning of Visual Representations,” in ICML, 2020.
  14. J. Wu, X. Wang, F. Feng, X. He, L. Chen, J. Lian, and X. Xie, “Self-supervised Graph Learning for Recommendation,” in SIGIR, 2021.
  15. J. Yu, H. Yin, X. Xia, T. Chen, L. Cui, and Q. V. H. Nguyen, “Are graph augmentations necessary? Simple graph contrastive learning for recommendation,” in SIGIR, 2022, pp. 1294–1303.
  16. C. Wang, W. Ma, and C. Chen, “Sequential recommendation with multiple contrast signals,” TOIS, Mar. 2022.
  17. C. Wang, Y. Yu, W. Ma, M. Zhang, C. Chen, Y. Liu, and S. Ma, “Towards representation alignment and uniformity in collaborative filtering,” in KDD, 2022, pp. 1816–1825.
  18. Z. Lin, C. Tian, Y. Hou, and W. X. Zhao, “Improving Graph Collaborative Filtering with Neighborhood-enriched Contrastive Learning,” in WWW, 2022, pp. 2320–2329.
  19. H. Dong, J. Chen, F. Feng, X. He, S. Bi, Z. Ding, and P. Cui, “On the equivalence of decoupled graph convolution network and label propagation,” in WWW, 2021, pp. 3651–3662.
  20. F. Feng, W. Huang, X. He, X. Xin, Q. Wang, and T.-S. Chua, “Should graph convolution trust neighbors? A simple causal inference method,” in SIGIR, 2021, pp. 1208–1218.
  21. K. Mao, J. Zhu, X. Xiao, B. Lu, Z. Wang, and X. He, “UltraGCN: Ultra Simplification of Graph Convolutional Networks for Recommendation,” in CIKM, Oct. 2021.
  22. L. Wu, P. Sun, Y. Fu, R. Hong, X. Wang, and M. Wang, “A neural influence diffusion model for social recommendation,” in SIGIR, 2019, pp. 235–244.
  23. L. Wu, J. Li, P. Sun, R. Hong, Y. Ge, and M. Wang, “DiffNet++: A Neural Influence and Interest Diffusion Network for Social Recommendation,” TKDE, Jan. 2021.
  24. Y. Liu, X. Ao, Z. Qin, J. Chi, J. Feng, H. Yang, and Q. He, “Pick and Choose: A GNN-based Imbalanced Learning Approach for Fraud Detection,” in WWW, Apr. 2021, pp. 3168–3177.
  25. J. Shuai, K. Zhang, L. Wu, P. Sun, R. Hong, M. Wang, and Y. Li, “A review-aware graph contrastive learning framework for recommendation,” in SIGIR, 2022, pp. 1283–1293.
  26. L. Wu, Y. Yang, K. Zhang, R. Hong, Y. Fu, and M. Wang, “Joint item recommendation and attribute inference: An adaptive graph convolutional network approach,” in SIGIR, 2020, pp. 679–688.
  27. J.-B. Grill, F. Strub, F. Altché, C. Tallec, P. H. Richemond, E. Buchatskaya, C. Doersch, B. A. Pires, Z. D. Guo, M. G. Azar, B. Piot, K. Kavukcuoglu, R. Munos, and M. Valko, “Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning,” in NeurIPS, 2020.
  28. M. Caron, I. Misra, J. Mairal, P. Goyal, P. Bojanowski, and A. Joulin, “Unsupervised Learning of Visual Features by Contrasting Cluster Assignments,” in NeurIPS, 2020.
  29. K. He, H. Fan, Y. Wu, S. Xie, and R. Girshick, “Momentum Contrast for Unsupervised Visual Representation Learning,” in CVPR, Jun. 2020, pp. 9726–9735.
  30. T. Gao, X. Yao, and D. Chen, “SimCSE: Simple Contrastive Learning of Sentence Embeddings,” in EMNLP, Sep. 2021.
  31. X. Liang, L. Wu, J. Li, Y. Wang, Q. Meng, T. Qin, W. Chen, M. Zhang, and T.-Y. Liu, “R-Drop: Regularized Dropout for Neural Networks,” in NeurIPS, Jun. 2021.
  32. Y. Yan, R. Li, S. Wang, F. Zhang, W. Wu, and W. Xu, “ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer,” in ACL, 2021.
  33. P. Veličković, W. Fedus, W. L. Hamilton, P. Liò, Y. Bengio, and R. D. Hjelm, “Deep Graph Infomax,” ICLR, 2019.
  34. Y. You, T. Chen, Y. Sui, T. Chen, Z. Wang, and Y. Shen, “Graph Contrastive Learning with Augmentations,” in NeurIPS, 2020.
  35. J. Qiu, Q. Chen, Y. Dong, J. Zhang, H. Yang, M. Ding, K. Wang, and J. Tang, “GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training,” in KDD, Aug. 2020, pp. 1150–1160.
  36. Y. You, T. Chen, Y. Shen, and Z. Wang, “Graph Contrastive Learning Automated,” in ICML, 2021.
  37. S. Suresh, P. Li, C. Hao, and J. Neville, “Adversarial Graph Augmentation to Improve Graph Contrastive Learning,” in NeurIPS, 2021.
  38. M. Noroozi and P. Favaro, “Unsupervised learning of visual representations by solving jigsaw puzzles,” in ECCV, 2016, pp. 69–84.
  39. S. Gidaris, P. Singh, and N. Komodakis, “Unsupervised representation learning by predicting image rotations,” arXiv preprint arXiv:1803.07728, 2018.
  40. J. Ma, C. Zhou, H. Yang, P. Cui, X. Wang, and W. Zhu, “Disentangled Self-Supervision in Sequential Recommenders,” in KDD, Aug. 2020, pp. 483–491.
  41. X. Xia, H. Yin, J. Yu, Y. Shao, and L. Cui, “Self-Supervised Graph Co-Training for Session-based Recommendation,” in CIKM, 2021, pp. 2180–2190.
  42. J. Yu, H. Yin, J. Li, Q. Wang, N. Q. V. Hung, and X. Zhang, “Self-Supervised Multi-Channel Hypergraph Convolutional Network for Social Recommendation,” in WWW, Apr. 2021, pp. 413–424.
  43. J. Shuai, K. Zhang, L. Wu, P. Sun, R. Hong, M. Wang, and Y. Li, “A review-aware graph contrastive learning framework for recommendation,” in SIGIR, 2022, pp. 1283–1293.
  44. J. Wu, X. Wang, X. Gao, J. Chen, H. Fu, T. Qiu, and X. He, “On the Effectiveness of Sampled Softmax Loss for Item Recommendation,” Jan. 2022.
  45. A. Ermolov, A. Siarohin, E. Sangineto, and N. Sebe, “Whitening for Self-Supervised Representation Learning,” in ICML, 2021.
  46. L. Xu, J. Lian, W. X. Zhao, M. Gong, L. Shou, D. Jiang, X. Xie, and J.-R. Wen, “Negative Sampling for Contrastive Representation Learning: A Review,” May 2022.
  47. J. Robinson, C.-Y. Chuang, S. Sra, and S. Jegelka, “Contrastive Learning with Hard Negative Samples,” in ICLR, 2021.
  48. F. Wang and H. Liu, “Understanding the Behaviour of Contrastive Loss,” in CVPR, Jun. 2021, pp. 2495–2504.
  49. A. Zhang, W. Ma, X. Wang, and T.-S. Chua, “Incorporating Bias-aware Margins into Contrastive Loss for Collaborative Filtering,” in NeurIPS, 2022.
  50. C. Wu, F. Wu, S. Ge, T. Qi, Y. Huang, and X. Xie, “Neural News Recommendation with Multi-Head Self-Attention,” in IJCNLP, 2019, pp. 6388–6393.
  51. P. Sun, L. Wu, K. Zhang, Y. Fu, R. Hong, and M. Wang, “Dual Learning for Explainable Recommendation: Towards Unifying User Preference Prediction and Review Generation,” in WWW, 2020, pp. 837–842.
  52. H. Wang, F. Zhang, X. Xie, and M. Guo, “DKN: Deep Knowledge-Aware Network for News Recommendation,” in WWW, 2018, pp. 1835–1844.
  53. K. Mao, J. Zhu, J. Wang, Q. Dai, Z. Dong, X. Xiao, and X. He, “SimpleX: A Simple and Strong Baseline for Collaborative Filtering,” CIKM, Sep. 2021.
  54. W. X. Zhao, S. Mu, Y. Hou, Z. Lin, Y. Chen, X. Pan, K. Li, Y. Lu, H. Wang, C. Tian, Y. Min, Z. Feng, X. Fan, X. Chen, P. Wang, W. Ji, Y. Li, X. Wang, and J.-R. Wen, “RecBole: Towards a Unified, Comprehensive and Efficient Framework for Recommendation Algorithms,” in CIKM, 2021, pp. 4653–4664.
Authors (5)
  1. Peijie Sun (48 papers)
  2. Le Wu (47 papers)
  3. Kun Zhang (353 papers)
  4. Xiangzhi Chen (1 paper)
  5. Meng Wang (1065 papers)
Citations (16)
