Network Robustness via Global k-cores (2012.10036v1)

Published 18 Dec 2020 in cs.SI

Abstract: Network robustness is a measure of a network's ability to survive adversarial attacks. But not all parts of a network are equal. K-cores, which are dense subgraphs, are known to capture some of the key properties of many real-life networks. Therefore, previous work has attempted to model network robustness via the stability of its k-core. However, these approaches account for a single core value and thus fail to encode a global network resilience measure. In this paper, we address this limitation by proposing a novel notion of network resilience that is defined over all cores. In particular, we evaluate the stability of the network under node removals with respect to each node's initial core. Our goal is to compute robustness via a combinatorial problem: find the b most critical nodes to delete such that the number of nodes that fall from their initial cores is maximized. One of our contributions is showing that it is NP-hard to achieve any polynomial-factor approximation of the given objective. We also present a fine-grained complexity analysis of this problem under the lens of parameterized complexity theory for several natural parameters. Moreover, we show two applications of our notion of robustness: measuring the evolution of species and characterizing networks arising from different domains.
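
The objective described in the abstract can be made concrete with a small sketch. The following Python snippet (an illustration, not the paper's algorithm) uses networkx to evaluate, for a candidate deletion set of b nodes, how many nodes end up below their initial core number; whether the deleted nodes themselves count toward the objective is an assumption made here. Since the paper shows the problem is NP-hard to approximate, the brute-force search over candidate sets is only feasible for very small graphs.

```python
# Hypothetical sketch (not the paper's method): evaluate the global k-core
# robustness objective for a candidate deletion set, using networkx.
from itertools import combinations
import networkx as nx

def core_fall_count(G, removed):
    """Count nodes whose core number drops below their initial core number
    after deleting the nodes in `removed`.

    Assumption: deleted nodes are counted as having fallen from their core.
    """
    initial_cores = nx.core_number(G)   # core number of every node in the original graph
    H = G.copy()
    H.remove_nodes_from(removed)
    new_cores = nx.core_number(H)       # core numbers after the deletion
    fallen = 0
    for v, k in initial_cores.items():
        if v in removed or new_cores.get(v, 0) < k:
            fallen += 1
    return fallen

# Example: brute-force the b most "critical" nodes on a small graph.
G = nx.karate_club_graph()
b = 2
best = max(combinations(G.nodes, b),
           key=lambda S: core_fall_count(G, set(S)))
print(best, core_fall_count(G, set(best)))
```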

Citations (7)
