SsAG: Summarization and sparsification of Attributed Graphs (2109.15111v3)

Published 30 Sep 2021 in cs.DM

Abstract: We present SsAG, an efficient and scalable lossy graph summarization method that retains the essential structure of the original graph. SsAG computes a sparse representation (summary) of the input graph and also caters to graphs with node attributes. The summary of a graph $G$ is stored as a graph on supernodes (subsets of vertices of $G$), with weighted superedges connecting pairs of supernodes. The proposed method constructs a summary graph on $k$ supernodes that minimizes the reconstruction error (the difference between the original graph and the graph reconstructed from the summary) and maximizes homogeneity with respect to attributes. We construct the summary by iteratively merging pairs of nodes. We derive a closed-form expression to efficiently compute the reconstruction error after merging a pair and approximate this score in constant time. To reduce the search space for selecting the best pair to merge, we assign a weight to each supernode that closely quantifies its contribution to the score of the pairs containing it. We choose the best pair for merging from a random sample of supernodes selected with probability proportional to their weights. With weighted sampling, a sample of logarithmic size yields a summary of comparable quality across various measures. We propose a sparsification step for the constructed summary to reduce the storage cost to a given target size with a marginal increase in reconstruction error. Empirical evaluation on several real-world graphs and comparison with state-of-the-art methods shows that SsAG is up to $5\times$ faster and generates summaries of comparable quality.
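
The greedy loop described in the abstract (iteratively merging supernode pairs, with merge candidates drawn by weighted sampling) can be illustrated with a short sketch. The Python below is a minimal illustration under stated assumptions, not the authors' implementation: the `merge_score` proxy, the size-based sampling weights, and the `networkx` graph interface are placeholders chosen for readability, whereas the paper uses a closed-form reconstruction-error expression, a constant-time approximation of it, and supernode weights that quantify each node's contribution to pair scores. The sparsification step is omitted.

```python
import math
import random
import networkx as nx

def summarize(G: nx.Graph, k: int, sample_size=None):
    """Greedily merge supernodes of G until only k remain (illustrative sketch)."""
    # Start with one singleton supernode per vertex.
    supernodes = [frozenset([v]) for v in G.nodes()]

    def edges_between(S, T):
        # Number of (ordered) vertex pairs across S x T joined by an edge of G.
        return sum(1 for u in S for v in T if G.has_edge(u, v))

    def merge_score(S, T):
        # Stand-in cost: how far the merged block is from being fully connected
        # internally; a rough proxy for the increase in reconstruction error.
        n = len(S) + len(T)
        possible = n * (n - 1) / 2
        actual = edges_between(S, S) / 2 + edges_between(T, T) / 2 + edges_between(S, T)
        return possible - actual

    while len(supernodes) > k:
        # Weighted sampling of a logarithmic-sized candidate set; here the weight
        # is simply 1/|S| (assumption), biasing toward small, cheap-to-merge blocks.
        m = sample_size or max(2, int(math.log2(len(supernodes))) + 1)
        weights = [1.0 / len(S) for S in supernodes]
        candidates = list(set(random.choices(range(len(supernodes)), weights=weights, k=m)))
        if len(candidates) < 2:
            continue
        # Pick the cheapest pair among the sampled candidates only.
        best = min(
            ((i, j) for idx, i in enumerate(candidates) for j in candidates[idx + 1:]),
            key=lambda p: merge_score(supernodes[p[0]], supernodes[p[1]]),
        )
        i, j = sorted(best, reverse=True)
        merged = supernodes[i] | supernodes[j]
        del supernodes[i], supernodes[j]  # i > j, so deleting i first keeps j valid
        supernodes.append(merged)
    return supernodes
```

Restricting the pair search to a small weighted sample is what keeps each merge step cheap; the paper's contribution is making that sample and the per-pair score computable quickly enough that overall quality matches exhaustive pair selection.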
