
Disentangled Contrastive Collaborative Filtering (2305.02759v4)

Published 4 May 2023 in cs.IR and cs.AI

Abstract: Recent studies show that graph neural networks (GNNs) are widely used to model high-order relationships for collaborative filtering (CF). Along this research line, graph contrastive learning (GCL) has exhibited powerful performance in addressing the supervision-label shortage by learning augmented user and item representations. While many of these methods are effective, two key questions remain unexplored: i) most existing GCL-based CF models ignore the fact that user-item interaction behaviors are often driven by diverse latent intent factors (e.g., shopping for a family party, or a preferred product color or brand); ii) their non-adaptive augmentation techniques are vulnerable to noisy information, which raises concerns about robustness and the risk of incorporating misleading self-supervised signals. In light of these limitations, we propose a Disentangled Contrastive Collaborative Filtering framework (DCCF) to realize intent disentanglement with self-supervised augmentation in an adaptive fashion. With disentangled representations learned over the global context, DCCF not only distills finer-grained latent factors from the entangled self-supervision signals but also alleviates augmentation-induced noise. Finally, a cross-view contrastive learning task enables adaptive augmentation via our parameterized interaction mask generator. Experiments on various public datasets demonstrate the superiority of our method over existing solutions. Our model implementation is released at https://github.com/HKUDS/DCCF.
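To make the cross-view contrastive objective concrete, below is a minimal sketch of the InfoNCE-style loss that GCL-based CF methods, including DCCF, build on: two augmented views of the same user (or item) embedding form a positive pair, while all other rows in the batch serve as negatives. This is an illustrative stand-in only; DCCF's learned intent prototypes and parameterized interaction mask generator are omitted, and the small-noise augmentation here is a hypothetical placeholder for the paper's adaptive augmentation.

```python
import numpy as np

def info_nce(view_a, view_b, temperature=0.2):
    """Cross-view InfoNCE loss: row i of view_a is pulled toward row i of
    view_b (the positive pair) and pushed away from all other rows."""
    # L2-normalize so dot products become cosine similarities
    a = view_a / np.linalg.norm(view_a, axis=1, keepdims=True)
    b = view_b / np.linalg.norm(view_b, axis=1, keepdims=True)
    logits = a @ b.T / temperature                  # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)     # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))              # positives on the diagonal

rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))                      # 8 users, 16-dim embeddings
# DCCF would produce the second view via its adaptive mask generator;
# small random noise is used here purely as a stand-in augmentation.
loss = info_nce(emb + 0.01 * rng.normal(size=emb.shape), emb)
```

With nearly identical views, the loss is close to zero; shuffling the rows of one view (destroying the positive pairing) drives it toward the uniform-assignment value, log N.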

Authors (5)
  1. Xubin Ren
  2. Lianghao Xia
  3. Jiashu Zhao
  4. Dawei Yin
  5. Chao Huang
Citations (46)

