A Distributed Algorithm for Solving Linear Algebraic Equations Over Random Networks
Abstract: In this paper, we consider the problem of solving linear algebraic equations of the form $Ax=b$ over a network of agents, where each agent seeks a solution using only local information in the presence of random communication topologies. The equation is solved by $m$ agents, each of which knows only a subset of the rows of the partitioned matrix $[A,b]$. We formulate the problem so that the formulation does not require knowledge of the distribution of the random interconnection graphs. Consequently, the framework accommodates asynchronous updates and unreliable communication protocols without a B-connectivity assumption. We apply the random Krasnoselskii-Mann iterative algorithm, which converges almost surely and in mean square to a solution of the problem for any matrix $A$, any vector $b$, and any initial conditions of the agents' states. We demonstrate that the limit point to which the agents' states converge is determined by the unique solution of a convex optimization problem, regardless of the distribution of the random communication graphs. Finally, we show by two numerical examples that the rate of convergence of the algorithm cannot be guaranteed.
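The Krasnoselskii-Mann (KM) update averages the current state with the output of a nonexpansive operator, $x^{k+1} = (1-\alpha)x^k + \alpha T(x^k)$. Below is a minimal sketch of how such an iteration might look for this problem: each agent mixes its state over a random communication graph and then projects onto the solution set of its local block $A_i x = b_i$. This is an illustrative construction under stated assumptions, not the paper's exact algorithm; the function name km_distributed_solve, the permutation-based random mixing matrix, and the step size alpha=0.5 are all choices made for the example.

```python
import numpy as np

def km_distributed_solve(A_blocks, b_blocks, n, num_iters=2000, alpha=0.5, seed=0):
    """Hypothetical sketch: KM-style distributed iteration for Ax = b.

    Each agent i knows only its block (A_i, b_i). Per iteration it
    (1) averages its state over a random communication graph, modeled here
        by a random doubly stochastic mixing matrix, and
    (2) projects that average onto the minimizers of ||A_i x - b_i||,
    then takes the KM convex combination of the old and new states.
    """
    rng = np.random.default_rng(seed)
    m = len(A_blocks)
    # Pseudoinverses give the projection y - A_i^+ (A_i y - b_i) onto the
    # local least-squares solution set of each agent.
    pinvs = [np.linalg.pinv(Ai) for Ai in A_blocks]
    X = rng.standard_normal((m, n))  # arbitrary initial agent states

    for _ in range(num_iters):
        # Random doubly stochastic mixing: each agent averages its state
        # with one randomly matched agent (illustrative topology model).
        P = np.eye(m)[rng.permutation(m)]
        W = 0.5 * (np.eye(m) + P)
        Y = W @ X
        for i in range(m):
            proj = Y[i] - pinvs[i] @ (A_blocks[i] @ Y[i] - b_blocks[i])
            X[i] = (1.0 - alpha) * X[i] + alpha * proj  # KM averaging step
    return X

# Example: a consistent 6x4 system partitioned row-wise among 3 agents.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
b = A @ np.ones(4)
X = km_distributed_solve(np.split(A, 3), np.split(b, 3), n=4)
print("max residual:", np.max(np.abs(A @ X.mean(axis=0) - b)))
```

Because the projection uses the pseudoinverse, each local step remains well defined even when an agent's block $A_i x = b_i$ has no exact solution: the agent then projects onto its local least-squares set, which is in the spirit of the abstract's claim that the limit point is characterized by a convex optimization problem.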