Efficient Learning of Balanced Signed Graphs via Sparse Linear Programming (2506.01826v1)

Published 2 Jun 2025 in cs.LG and eess.SP

Abstract: Signed graphs are equipped with both positive and negative edge weights, encoding pairwise correlations as well as anti-correlations in data. A balanced signed graph is a signed graph with no cycles containing an odd number of negative edges. The Laplacian of a balanced signed graph has eigenvectors that map, via a simple linear transform, to those of a corresponding positive graph Laplacian, enabling reuse of spectral filtering tools designed for positive graphs. We propose an efficient method to learn a balanced signed graph Laplacian directly from data. Specifically, extending a previous linear programming (LP)-based sparse inverse covariance estimation method called CLIME, we formulate a new LP problem for each Laplacian column $i$, where the linear constraints restrict the weight signs of edges stemming from node $i$, so that nodes of the same / different polarities are connected by positive / negative edges. Towards optimal model selection, we derive a suitable CLIME parameter $\rho$ based on a combination of the Hannan-Quinn information criterion and a minimum feasibility criterion. We solve the LP problem efficiently by tailoring a sparse LP method based on ADMM. We theoretically prove local solution convergence of our proposed iterative algorithm. Extensive experimental results on synthetic and real-world datasets show that our balanced graph learning method outperforms competing methods and enables reuse of spectral filters, wavelets, and graph convolutional nets (GCNs) constructed for positive graphs.
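
The "simple linear transform" mentioned in the abstract is the classical switching argument for balanced signed graphs: a diagonal matrix of node polarities maps the signed Laplacian to an ordinary positive-graph Laplacian with the same spectrum. The sketch below is a minimal numerical illustration, not the paper's implementation; it assumes the signed-Laplacian convention with absolute-value degrees, and the polarity vector `s` and adjacency `A` are made-up example data.

```python
import numpy as np

# Toy balanced signed graph on 4 nodes with assumed polarities s = (+1, +1, -1, -1).
# Positive edges connect same-polarity nodes; negative edges connect opposite-polarity nodes.
s = np.array([1, 1, -1, -1])
A = np.array([
    [ 0.0,  0.8,  0.0, -0.5],
    [ 0.8,  0.0, -0.6,  0.0],
    [ 0.0, -0.6,  0.0,  0.7],
    [-0.5,  0.0,  0.7,  0.0],
])

# Signed graph Laplacian using absolute-value degrees (one common convention).
L_signed = np.diag(np.abs(A).sum(axis=1)) - A

# Switching matrix D = diag(s). For a balanced graph, D A D is entrywise
# nonnegative, i.e. the adjacency of an ordinary positive graph.
D = np.diag(s)
A_pos = D @ A @ D
assert (A_pos >= 0).all()
L_pos = np.diag(A_pos.sum(axis=1)) - A_pos

# D is a similarity transform between the two Laplacians: eigenvalues coincide,
# and eigenvectors are related by v_pos = D @ v_signed.
assert np.allclose(D @ L_signed @ D, L_pos)
evals_signed = np.linalg.eigvalsh(L_signed)
evals_pos = np.linalg.eigvalsh(L_pos)
assert np.allclose(evals_signed, evals_pos)
```

Because the two Laplacians share a spectrum and their eigenvectors differ only by sign flips per node, any spectral filter, wavelet, or GCN built for the positive graph can be applied to signals on the balanced signed graph after multiplying by D.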
