
Laplacian Coarse Graining in Complex Networks (2302.07093v2)

Published 14 Feb 2023 in cond-mat.dis-nn and cond-mat.stat-mech

Abstract: Complex networks can model a range of different systems, from the human brain to social connections. Some of these networks have a large number of nodes and links, making it impractical to analyze them directly. One strategy to simplify such systems is to create miniaturized versions of the networks that preserve their main properties. A convenient tool that applies this strategy is the renormalization group (RG), a methodology used in statistical physics to change the scales of physical systems. The method consists of two steps: a coarse-graining, in which the size of the system is reduced, and a rescaling of the interactions to compensate for the information loss. This work applies RG to complex networks by introducing a coarse-graining method based on the Laplacian matrix. We use a field-theoretical approach to calculate the correlation function and coarse-grain the most correlated nodes into super-nodes, applying our method to several artificial and real-world networks. The results are promising, with most of the networks under analysis showing self-similar properties across different scales.
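The pipeline described in the abstract can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes the field-theoretical correlation function is taken as the Moore-Penrose pseudoinverse of the graph Laplacian (the standard Gaussian-model choice), and it performs a single coarse-graining step by merging the most correlated pair of nodes into one super-node whose links are the summed link weights. The rescaling step of the RG is omitted.

```python
import numpy as np

def laplacian(adj):
    """Graph Laplacian L = D - A from a (symmetric) adjacency matrix."""
    return np.diag(adj.sum(axis=1)) - adj

def correlation(adj):
    """Two-point correlation, assumed here to be the pseudoinverse of L
    (Gaussian field theory on the network)."""
    return np.linalg.pinv(laplacian(adj))

def merge_most_correlated(adj):
    """One coarse-graining step: collapse the most correlated pair (i, j)
    into a single super-node. Illustrative sketch, not the paper's code."""
    C = correlation(adj)
    n = adj.shape[0]
    off = C.copy()
    np.fill_diagonal(off, -np.inf)          # ignore self-correlations
    i, j = np.unravel_index(np.argmax(off), off.shape)
    keep = [k for k in range(n) if k != j]  # drop node j, fold it into i
    new = adj[np.ix_(keep, keep)].astype(float)
    idx = keep.index(i)
    for k_pos, k in enumerate(keep):
        if k == i:
            continue                        # the i-j link itself is dropped
        new[idx, k_pos] += adj[j, k]        # super-node inherits j's links
        new[k_pos, idx] += adj[k, j]
    return new

# Example: 5-node path graph, one coarse-graining step -> 4 nodes
A = np.zeros((5, 5))
for u in range(4):
    A[u, u + 1] = A[u + 1, u] = 1
A2 = merge_most_correlated(A)
print(A2.shape)  # (4, 4)
```

Repeating the merge step shrinks the network scale by scale, which is the setting in which the self-similarity reported in the abstract can be probed.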

Citations (3)