Not All Negatives Are Worth Attending to: Meta-Bootstrapping Negative Sampling Framework for Link Prediction

Published 8 Dec 2023 in cs.LG (arXiv:2312.04815v2)

Abstract: The rapid development of graph neural networks (GNNs) has fueled progress in link prediction, which now achieves promising performance across a variety of applications. Unfortunately, through a comprehensive analysis, we find, surprisingly, that current link predictors with dynamic negative samplers (DNSs) suffer from a migration phenomenon between "easy" and "hard" samples, which runs counter to the DNS preference for choosing "hard" negatives and thus severely limits their capability. To address this, we propose the MeBNS framework, a general plugin that can potentially improve current negative-sampling-based link predictors. In particular, we devise a Meta-learning Supported Teacher-student GNN (MST-GNN) that is not only built upon a teacher-student architecture to alleviate the migration between "easy" and "hard" samples but is also equipped with a meta-learning-based sample re-weighting module that helps the student GNN distinguish "hard" samples in a fine-grained manner. To effectively guide the learning of MST-GNN, we introduce a Structure enhanced Training Data Generator (STD-Generator) and an Uncertainty based Meta Data Collector (UMD-Collector), which support the teacher and student GNN, respectively. Extensive experiments show that MeBNS achieves remarkable performance across six link prediction benchmark datasets.
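To make the "dynamic negative sampler" the abstract critiques concrete, here is a minimal sketch of the standard DNS idea: for each positive edge, score a random pool of corrupted edges with the current model and keep the highest-scoring ("hardest") one. This is an illustrative toy, not the paper's MeBNS method; the `score` function, embedding sizes, and pool size are all assumptions, with a random dot-product scorer standing in for a trained GNN link predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

def dns_sample(pos_edge, score_fn, num_nodes, pool_size=16):
    """Dynamic negative sampling (DNS): for a positive edge (u, v),
    draw a pool of random candidate tail nodes, score each corrupted
    edge (u, v') with the current model, and return the candidate
    with the highest score -- the "hardest" negative."""
    u, _ = pos_edge
    candidates = rng.integers(0, num_nodes, size=pool_size)
    scores = np.array([score_fn(u, int(v_neg)) for v_neg in candidates])
    return (u, int(candidates[np.argmax(scores)]))

# Hypothetical scorer: dot product of fixed random node embeddings,
# standing in for a trained GNN's link score.
emb = rng.normal(size=(100, 8))
score = lambda u, v: float(emb[u] @ emb[v])

hard_neg = dns_sample((3, 7), score, num_nodes=100)
```

The migration phenomenon the abstract describes arises because, as the model updates, a candidate that looked "hard" under one set of parameters can become "easy" under the next, so the sampler's ranking keeps shifting during training.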

