Boosting Parallel Influence-Maximization Kernels for Undirected Networks with Fusing and Vectorization (2008.03095v1)

Published 7 Aug 2020 in cs.DC and cs.SI

Abstract: Influence maximization (IM) is the problem of finding a seed vertex set that is expected to incur the maximum influence spread on a graph. It has various practical applications, such as devising an effective and efficient way to disseminate information, news, or ads within a social network. The problem has been shown to be NP-hard, and approximation algorithms with provable quality guarantees exist in the literature. However, these algorithms are computationally expensive even for medium-scale graphs. Furthermore, graph algorithms usually suffer from spatial and temporal irregularities during memory accesses, which adds an extra cost on top of the already expensive IM kernels. In this work, we leverage fused sampling, memoization, and vectorization to restructure, parallelize, and boost the performance of these kernels on undirected networks. The proposed approach employs a pseudo-random function and performs multiple Monte-Carlo simulations in parallel to exploit the SIMD lanes effectively and efficiently. In addition, it significantly reduces the number of edge traversals, and hence the amount of data brought from memory, which is critical for almost all memory-bound graph kernels. We apply the proposed approach to the traditional MixGreedy algorithm and propose Infuser, which is more than 3000 times faster than traditional greedy approaches and can run on large graphs that have been considered too large in the literature.
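To make the core idea concrete, below is a minimal, self-contained C++ sketch of how multiple Monte-Carlo cascade simulations can be fused and run bit-parallel from a single graph traversal, using a deterministic hash as the pseudo-random function so edge outcomes are recomputed on the fly rather than stored. This is an illustration of the general technique described in the abstract, not the paper's Infuser kernel: the SplitMix64-style hash, the uniform edge probability p, and the 64-simulations-per-word layout are all assumptions made for the example.

```cpp
#include <algorithm>
#include <bitset>
#include <cstdint>
#include <iostream>
#include <queue>
#include <vector>

// SplitMix64-style hash used as the pseudo-random function (assumption).
static inline uint64_t hash64(uint64_t x) {
    x += 0x9e3779b97f4a7c15ULL;
    x = (x ^ (x >> 30)) * 0xbf58476d1ce4e5b9ULL;
    x = (x ^ (x >> 27)) * 0x94d049bb133111ebULL;
    return x ^ (x >> 31);
}

// For one edge, return a 64-bit mask whose bit s is 1 if the edge is "live"
// in simulation s; each bit is an independent Bernoulli(p) draw derived from
// the hash, so no random state has to be materialized per simulation.
static uint64_t edge_live_mask(uint64_t edge_id, double p) {
    uint64_t mask = 0;
    for (int s = 0; s < 64; ++s) {
        uint64_t h = hash64(edge_id * 64 + s);
        double u = (h >> 11) * (1.0 / 9007199254740992.0);  // uniform in [0,1)
        if (u < p) mask |= (1ULL << s);
    }
    return mask;
}

// Run 64 fused cascades from `seed` over an undirected graph in CSR form
// (xadj/adj). Returns the total number of (vertex, simulation) activations,
// i.e. 64 times the average spread estimate for this seed.
uint64_t fused_spread(const std::vector<int>& xadj, const std::vector<int>& adj,
                      int seed, double p) {
    int n = static_cast<int>(xadj.size()) - 1;
    std::vector<uint64_t> reached(n, 0);  // bit s set: vertex reached in sim s
    std::queue<int> frontier;
    reached[seed] = ~0ULL;                // the seed is active in every sim
    frontier.push(seed);
    uint64_t total = 64;                  // count the seed itself, once per sim
    while (!frontier.empty()) {
        int u = frontier.front(); frontier.pop();
        for (int e = xadj[u]; e < xadj[u + 1]; ++e) {
            int v = adj[e];
            // Canonical edge id so both directions of an undirected edge draw
            // the same coin (an assumption about the sampling model).
            uint64_t eid = static_cast<uint64_t>(std::min(u, v)) * n + std::max(u, v);
            uint64_t live = edge_live_mask(eid, p);
            uint64_t newly = (reached[u] & live) & ~reached[v];
            if (newly) {
                total += std::bitset<64>(newly).count();
                reached[v] |= newly;
                frontier.push(v);  // re-enqueue so the new bits propagate further
            }
        }
    }
    return total;
}

int main() {
    // Tiny triangle graph 0-1, 1-2, 0-2 in CSR form.
    std::vector<int> xadj = {0, 2, 4, 6};
    std::vector<int> adj  = {1, 2, 0, 2, 0, 1};
    std::cout << "avg spread from vertex 0: "
              << fused_spread(xadj, adj, 0, 0.5) / 64.0 << "\n";
}
```

Because each 64-bit word carries the state of 64 simulations, a single pass over the edges advances all of them at once, which is the same data-reuse argument the abstract makes for exploiting SIMD lanes and reducing the volume of data fetched from memory.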

Authors (2)
  1. Gokhan Gokturk (46 papers)
  2. Kamer Kaya (26 papers)
Citations (10)
