
Learning Large-scale Network Embedding from Representative Subgraph (2112.01442v1)

Published 2 Dec 2021 in cs.SI and cs.AI

Abstract: We study the problem of large-scale network embedding, which aims to learn low-dimensional latent representations for network mining applications. Recent research in the field of network embedding has led to significant progress, including DeepWalk, LINE, NetMF, and NetSMF. However, the huge size of many real-world networks makes it computationally expensive to learn network embedding from the entire network. In this work, we present a novel network embedding method called "NES", which learns network embedding from a small representative subgraph. NES leverages theories from graph sampling to efficiently construct a representative subgraph of smaller size that can be used to make inferences about the full network, enabling significantly improved efficiency in embedding learning. NES then efficiently computes the network embedding from this representative subgraph. Extensive experiments on networks of various scales and types demonstrate that NES achieves performance comparable to well-known methods while being significantly more efficient.
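The sample-then-embed pipeline described in the abstract can be sketched generically. The snippet below is an illustrative assumption, not the paper's actual algorithm: it uses a plain random-walk sampler as a stand-in for the graph-sampling step (NES builds its representative subgraph using graph-sampling theory; the exact sampler is specified in the paper), and the function names `random_walk_sample` and `induced_subgraph` are hypothetical.

```python
import random

def random_walk_sample(adj, start, walk_len, seed=0):
    """Collect a node set via a simple random walk over an adjacency dict.

    A generic stand-in for the representative-subgraph sampling step;
    the paper's sampler may differ.
    """
    rng = random.Random(seed)
    node = start
    visited = {node}
    for _ in range(walk_len):
        node = rng.choice(adj[node])  # step to a uniformly random neighbor
        visited.add(node)
    return visited

def induced_subgraph(adj, nodes):
    """Restrict the graph to the sampled nodes (keep edges with both ends inside)."""
    return {u: [v for v in adj[u] if v in nodes] for u in nodes}

# Toy graph: a 6-node cycle, as an adjacency dict.
adj = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
nodes = random_walk_sample(adj, start=0, walk_len=20)
sub = induced_subgraph(adj, nodes)
```

Any standard embedding method (e.g. a DeepWalk-style model) would then be trained on `sub` instead of the full network, which is where the efficiency gain comes from.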

Authors (8)
  1. Junsheng Kong (2 papers)
  2. Weizhao Li (2 papers)
  3. Ben Liao (7 papers)
  4. Jiezhong Qiu (29 papers)
  5. Chang-Yu Hsieh
  6. Yi Cai (83 papers)
  7. Jinhui Zhu (6 papers)
  8. Shengyu Zhang (160 papers)
Citations (1)
