
Parallel Batch-Dynamic Maximal Matching with Constant Work per Update (2503.09908v1)

Published 12 Mar 2025 in cs.DS and cs.DC

Abstract: We present a work-optimal algorithm for parallel fully batch-dynamic maximal matching against an oblivious adversary. In particular, it processes batches of updates (either insertions or deletions of edges) in constant expected amortized work per edge update, and in $O(\log^3 m)$ depth per batch whp, where $m$ is the maximum number of edges in the graph over time. This greatly improves on the recent result of Ghaffari and Trygub (2024), which requires $O(\log^9 m)$ amortized work per update and $O(\log^4 m)$ depth per batch, both whp. The algorithm can also be used for hyperedge maximal matching. For hypergraphs of rank $r$ (the maximum cardinality of any edge), the algorithm supports batches of insertions and deletions with $O(r^3)$ expected amortized work per edge update, and $O(\log^3 m)$ depth per batch whp. This is a factor of $O(r)$ in work off of the best sequential algorithm, that of Assadi and Solomon (2021), which uses $O(r^2)$ work per update. Ghaffari and Trygub's parallel batch-dynamic algorithm on hypergraphs requires $O(r^8 \log^9 m)$ amortized work per edge update whp. We leverage ideas from the prior algorithms but introduce substantial new ideas. Furthermore, our algorithm is relatively simple, perhaps even simpler than the sequential hyperedge algorithm. We also present the first work-efficient algorithm for maximal matching on hypergraphs. For a hypergraph with total cardinality $m'$ (i.e., the sum of the cardinalities of all edges), the algorithm runs in $O(m')$ work in expectation and $O(\log^2 m)$ depth whp. This algorithm also has properties that allow us to use it as a subroutine in the dynamic algorithm to select random edges of the graph to add to the matching.
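For readers unfamiliar with the underlying problem, the invariant the paper maintains dynamically can be illustrated with a simple sequential greedy sketch (this is not the paper's parallel batch-dynamic algorithm, just a minimal example of what a maximal matching is: a set of vertex-disjoint edges that no remaining edge can be added to):

```python
def maximal_matching(edges):
    """Greedy maximal matching: scan edges once and take any edge
    whose endpoints are both still unmatched."""
    matched = set()   # vertices already covered by the matching
    matching = []
    for u, v in edges:
        if u not in matched and v not in matched:
            matching.append((u, v))
            matched.update((u, v))
    return matching

# On a 4-cycle, the greedy scan takes (0, 1), skips (1, 2) and (3, 0)
# because an endpoint is covered, and takes (2, 3).
M = maximal_matching([(0, 1), (1, 2), (2, 3), (3, 0)])
```

Every edge not in `M` shares an endpoint with some edge of `M`, which is exactly the maximality condition the dynamic algorithm must restore after each batch of insertions and deletions.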
