Fast and Error-Adaptive Influence Maximization based on Count-Distinct Sketches (2105.04023v1)

Published 9 May 2021 in cs.SI and cs.DS

Abstract: Influence maximization (IM) is the problem of finding a seed vertex set that maximizes the expected number of vertices influenced under a given diffusion model. Due to the NP-hardness of finding an optimal seed set, approximation algorithms are frequently used for IM. In this work, we describe a fast, error-adaptive approach that leverages Count-Distinct sketches and hash-based fused sampling. To estimate the number of influenced vertices throughout a diffusion, we use per-vertex Flajolet-Martin sketches, where each sketch corresponds to a sampled subgraph. To simulate the diffusions efficiently, the reach-set cardinalities of a single vertex are stored consecutively in memory. This layout allows the proposed algorithm to estimate the number of influenced vertices for all simulations at once, in a single step. For a faster IM kernel, we rebuild the sketches in parallel only after observing estimation errors above a given threshold. Our experimental results show that the proposed algorithm yields high-quality seed sets while being up to 119x faster than a state-of-the-art approximation algorithm. In addition, it is up to 62x faster than a sketch-based approach while producing seed sets with 3%-12% better influence scores.
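To make the sketch-based estimation concrete, the following is a minimal, illustrative Flajolet-Martin sketch in Python, not the authors' implementation: each vertex keeps a small bitmap per sampled subgraph, the union of two reach sets corresponds to a bitwise OR of their bitmaps, and the reach-set cardinality of a seed is read off the merged bitmap. All names (FMSketch, estimate_reach, NUM_BITS) and the toy propagation loop are assumptions for illustration only.

```python
# Illustrative Flajolet-Martin (FM) sketch for reach-set cardinality estimation.
# Assumed/simplified: single 32-bit bitmap per sketch, random vertex hashes,
# and a plain BFS over one sampled ("live-edge") subgraph.
import random

NUM_BITS = 32          # bitmap width of one FM sketch
PHI = 0.77351          # Flajolet-Martin correction constant

def _rho(h: int) -> int:
    """Index of the least significant set bit of h."""
    if h == 0:
        return NUM_BITS - 1
    return (h & -h).bit_length() - 1

class FMSketch:
    """One FM bitmap; the union of two sets is the bitwise OR of their bitmaps."""
    def __init__(self) -> None:
        self.bitmap = 0

    def add(self, item_hash: int) -> None:
        self.bitmap |= 1 << _rho(item_hash)

    def merge(self, other: "FMSketch") -> None:
        self.bitmap |= other.bitmap

    def estimate(self) -> float:
        # R = position of the lowest zero bit in the bitmap
        r = 0
        while (self.bitmap >> r) & 1:
            r += 1
        return (2 ** r) / PHI

def estimate_reach(seed: int, live_edges: dict[int, list[int]]) -> float:
    """Estimate |reach set of seed| on one sampled subgraph by OR-merging
    the sketches of every vertex reached from the seed."""
    hashes = {v: random.getrandbits(NUM_BITS) for v in live_edges}
    sketch = FMSketch()
    stack, seen = [seed], {seed}
    while stack:
        v = stack.pop()
        sketch.add(hashes.get(v, random.getrandbits(NUM_BITS)))
        for w in live_edges.get(v, []):
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return sketch.estimate()
```

In the paper's setting, one such sketch is kept per vertex and per sampled subgraph, and the per-vertex sketches are laid out consecutively in memory so that a seed's estimated influence across all samples can be aggregated in one pass; the sketch above only shows the underlying cardinality-estimation idea.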

Authors (2)
  1. Gokhan Gokturk (46 papers)
  2. Kamer Kaya (26 papers)
Citations (3)
