Few-shot Learning on Heterogeneous Graphs: Challenges, Progress, and Prospects (2403.13834v1)

Published 10 Mar 2024 in cs.LG

Abstract: Few-shot learning on heterogeneous graphs (FLHG) is attracting increasing attention from both academia and industry because prevailing studies on heterogeneous graphs often suffer from label sparsity. FLHG aims to mitigate the performance degradation caused by limited annotated data, and numerous recent studies have proposed a variety of methods and applications. In this paper, we provide a comprehensive review of existing FLHG methods, covering challenges, research progress, and future prospects. Specifically, we first formalize FLHG and categorize its methods into three types: single-heterogeneity FLHG, dual-heterogeneity FLHG, and multi-heterogeneity FLHG. Then, we analyze the research progress within each category, highlighting the most recent and representative developments. Finally, we identify and discuss promising directions for future research in FLHG. To the best of our knowledge, this paper is the first systematic and comprehensive review of FLHG.
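
To make the few-shot setting concrete, the minimal sketch below (not taken from the paper) illustrates an N-way K-shot episode with a prototypical-network-style classifier over node embeddings of a heterogeneous graph. The encoder is stubbed out with random vectors, and the names, dimensions, and choice of a prototype classifier are illustrative assumptions rather than the survey's own method.

```python
# Minimal sketch (assumption, not from the paper): an N-way K-shot episode
# with prototype-based classification over heterogeneous-graph node embeddings.
import numpy as np

rng = np.random.default_rng(0)

N_WAY, K_SHOT, N_QUERY, DIM = 2, 3, 5, 16  # illustrative episode configuration


def embed_nodes(num_nodes: int) -> np.ndarray:
    """Placeholder for a heterogeneous-graph encoder (e.g., a meta-path or
    attention-based HGNN); here it simply returns random embeddings."""
    return rng.normal(size=(num_nodes, DIM))


# Support set: K_SHOT labeled nodes per class; query set: nodes to classify.
support = embed_nodes(N_WAY * K_SHOT).reshape(N_WAY, K_SHOT, DIM)
query = embed_nodes(N_QUERY)

# Class prototypes are the mean of each class's support embeddings.
prototypes = support.mean(axis=1)  # shape: (N_WAY, DIM)

# Assign each query node to its nearest prototype (Euclidean distance).
dists = np.linalg.norm(query[:, None, :] - prototypes[None, :, :], axis=-1)
predictions = dists.argmin(axis=1)
print(predictions)  # predicted class index for each query node
```

In the label-sparse regime the survey targets, each training episode samples a fresh set of classes with only K labeled nodes apiece, so the model must generalize to new classes from a handful of examples.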
