Multi-Sender Index Coding
- Multi-Sender Index Coding is a communication framework where distributed senders encode subsets of messages under side-information constraints to minimize broadcast cost.
- Graph-based models, including dependency digraphs, sender-constraint graphs, and hypergraphs, capture system complexity and guide optimal code design.
- MSIC techniques are applied in distributed storage, coded caching, and cooperative networks to enhance efficiency through joint encoding strategies.
Multi-Sender Index Coding (MSIC) refers to a structured communication paradigm wherein multiple distributed senders, each possessing only a subset of the global message set, must collaboratively encode transmissions for a population of receivers with diverse side-information. MSIC generalizes classical single-sender index coding and arises naturally in distributed storage, multi-hop networks, and cache-aided communication. The principal objective is to minimize the aggregate transmission cost (aggregate codelength or total broadcast rate) required to satisfy all receiver demands, subject to the constraints imposed by the sender-message assignments.
1. Formal Problem Statement
Given a message set $\mathcal{X} = \{x_1, \dots, x_m\}$ where each $x_i \in \mathbb{F}_q$, an ensemble of $K$ senders holds subsets $\mathcal{M}_1, \dots, \mathcal{M}_K \subseteq \mathcal{X}$, with $\bigcup_{s=1}^{K} \mathcal{M}_s = \mathcal{X}$. Each of $n$ receivers wants a specific message $x_{d(r)}$ and possesses a (possibly overlapping) side-information set $\mathcal{K}_r \subseteq \mathcal{X}$. Each sender applies an encoding function $E_s$ mapping its available messages $\mathcal{M}_s$ to a codeword in $\mathbb{F}_q^{\ell_s}$. The sum codelength $\sum_{s=1}^{K} \ell_s$ is to be minimized, subject to the constraint that for each receiver $r$ there exists a decoding function $D_r$ with $D_r\bigl(E_1(\mathcal{M}_1), \dots, E_K(\mathcal{M}_K), \mathcal{K}_r\bigr) = x_{d(r)}$. In the linear coding regime, these functions are linear, and the constraints prohibit any sender from forming coded symbols mixing messages it does not hold.
A critical restriction in MSIC is that no coded symbol transmitted by any sender may involve messages not simultaneously available to that sender, thus introducing combinatorial coupling not present in the single-sender case (Thapa et al., 2016).
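To make this constraint concrete, the following minimal Python sketch sets up a hypothetical three-message, two-sender instance over GF(2) and brute-force checks that a candidate scalar linear code both respects the sender-availability constraint and is decodable at every receiver; all identifiers (`MESSAGES`, `SENDERS`, `CODE`, and so on) are illustrative and not drawn from the cited works.

```python
# A minimal sketch of the MSIC setup over GF(2); all names are hypothetical.
# Instance: 3 messages, 2 senders, 3 receivers with "cyclic" side information.
from itertools import product

MESSAGES = [0, 1, 2]                       # indices of x1, x2, x3
SENDERS  = {"S1": {0, 1}, "S2": {1, 2}}    # message subsets held by each sender
RECEIVERS = [                              # (wanted message, side-information set)
    (0, {1}),  # R1 wants x1, knows x2
    (1, {2}),  # R2 wants x2, knows x3
    (2, {0}),  # R3 wants x3, knows x1
]

# Candidate scalar linear code: each coded symbol is an XOR of messages and
# may only combine messages available at the transmitting sender.
CODE = {"S1": [{0, 1}], "S2": [{1, 2}]}    # S1 sends x1+x2, S2 sends x2+x3

def respects_sender_constraints(code, senders):
    return all(sym <= senders[s] for s, syms in code.items() for sym in syms)

def decodable(code, receivers, messages):
    """Brute-force check: every receiver recovers its demand from the coded
    symbols plus its side information, for all message realizations."""
    symbols = [sym for syms in code.values() for sym in syms]
    for want, known in receivers:
        seen = {}   # maps observable tuple -> decoded demand value
        for x in product((0, 1), repeat=len(messages)):
            obs = (tuple(x[i] for i in sorted(known)),
                   tuple(sum(x[i] for i in sym) % 2 for sym in symbols))
            if obs in seen and seen[obs] != x[want]:
                return False   # identical observations, different demand values
            seen[obs] = x[want]
    return True

assert respects_sender_constraints(CODE, SENDERS)
assert decodable(CODE, RECEIVERS, MESSAGES)
print("Sum codelength:", sum(len(v) for v in CODE.values()))  # -> 2
```

On this instance, $x_1 + x_2$ from the first sender and $x_2 + x_3$ from the second satisfy all three receivers, so the sum codelength is 2; note that neither sender could transmit a symbol mixing $x_1$ and $x_3$, since no single sender holds both.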
2. Graph-Theoretic and Hypergraph Models
MSIC is naturally modeled using layered graph-theoretic constructs (a small constructive sketch follows this list):
- Side-information (dependency) digraph $\mathcal{D}$: vertices represent message indices; a directed arc $(i, j)$ indicates that receiver $i$ knows $x_j$.
- Sender-constraint graph $\mathcal{U}$: undirected; an edge $\{i, j\}$ exists if $x_i$ and $x_j$ are not co-located at any sender.
- Message-graph $\mathcal{G}$: edges connect message pairs co-located at some sender (Ong et al., 2013).
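For the same toy instance as above (two senders holding $\{x_1, x_2\}$ and $\{x_2, x_3\}$, cyclic side information), the sketch below derives the three graphs directly from the sender and side-information sets; the construction and identifiers are illustrative only.

```python
# A minimal sketch (hypothetical names) deriving the three graph models from
# the toy instance: senders S1={x1,x2}, S2={x2,x3}; receiver i demands x_i.
from itertools import combinations

SENDERS = {"S1": {0, 1}, "S2": {1, 2}}
SIDE_INFO = {0: {1}, 1: {2}, 2: {0}}       # receiver i knows these messages

# Side-information digraph D: arc (i, j) iff receiver i knows x_j.
dependency_arcs = {(i, j) for i, known in SIDE_INFO.items() for j in known}

# Message-graph G: edge {i, j} iff x_i and x_j are co-located at some sender.
message_edges = {frozenset(e) for held in SENDERS.values()
                 for e in combinations(sorted(held), 2)}

# Sender-constraint graph U: edge {i, j} iff x_i and x_j are NOT co-located
# at any sender (the complement of G on the message vertex set).
all_pairs = {frozenset(e) for e in combinations(range(3), 2)}
constraint_edges = all_pairs - message_edges

print("D arcs :", sorted(dependency_arcs))                          # (0,1),(1,2),(2,0)
print("G edges:", sorted(tuple(sorted(e)) for e in message_edges))  # {0,1},{1,2}
print("U edges:", sorted(tuple(sorted(e)) for e in constraint_edges))  # {0,2}
```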
A recent development is the 4-uniform side-information hypergraph framework (Khalesi et al., 15 Dec 2025). Here, hyperedges encode not only traditional side-information and sender-message associations but also multi-sender "cross-cancellation" relationships that capture the joint decoding and interference-cancellation structure possible in MSIC.
Hyperedges are specifically of three types:
- Demand edges: receiver $r$'s demand $x_{d(r)}$ is present at sender $s$ (i.e., $x_{d(r)} \in \mathcal{M}_s$).
- Cached side-information edges: receiver $r$ knows $x_j \in \mathcal{K}_r$, which is present at sender $s$ ($x_j \in \mathcal{M}_s$).
- Coupled edges: $x_j$ is held by senders $s$ and $s'$ ($x_j \in \mathcal{M}_s \cap \mathcal{M}_{s'}$) and is not available as side-information at receiver $r$ ($x_j \notin \mathcal{K}_r$).
This hypergraph admits a precise correspondence between sub-hypergraph "fittings" and valid linear index codes, and provides the foundation for the hyper-minrank characterization of MSIC.
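Purely to illustrate the bookkeeping involved, the sketch below enumerates the three hyperedge types for the toy instance, using only the verbal descriptions above; the exact vertex composition of hyperedges in the cited framework may differ, and all identifiers are hypothetical.

```python
# A schematic enumeration of the three hyperedge types described above;
# based only on the verbal definitions, not the exact cited construction.
from itertools import combinations

SENDERS = {"S1": {0, 1}, "S2": {1, 2}}
DEMAND = {0: 0, 1: 1, 2: 2}                # receiver r demands message DEMAND[r]
SIDE_INFO = {0: {1}, 1: {2}, 2: {0}}

# Demand edges: receiver r's demand is present at sender s.
demand_edges = [(r, d, s) for r, d in DEMAND.items()
                for s, held in SENDERS.items() if d in held]

# Cached side-information edges: receiver r knows x_j, present at sender s.
cached_edges = [(r, j, s) for r, known in SIDE_INFO.items() for j in known
                for s, held in SENDERS.items() if j in held]

# Coupled edges: x_j held by two senders and unknown at receiver r.
coupled_edges = [(j, s, t, r)
                 for s, t in combinations(SENDERS, 2)
                 for j in SENDERS[s] & SENDERS[t]
                 for r, known in SIDE_INFO.items() if j not in known]

print("demand :", demand_edges)
print("cached :", cached_edges)
print("coupled:", coupled_edges)
```

The coupled edges here record that $x_2$ is replicated at both senders, which is exactly the structure a cross-cancellation scheme exploits: receiver 3 decodes $x_3$ by adding the two transmissions $x_1 + x_2$ and $x_2 + x_3$ and cancelling $x_2$.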
3. Fundamental Bounds and Exact Achievability
The characterization of optimal codelength divides into lower and upper bounds of graph-theoretic or algebraic origin.
- Hyper-Minrank: The optimal scalar linear broadcast length equals the "hyper-minrank" of the side-information hypergraph $\mathcal{H}$, i.e., $\min \sum_{s=1}^{K} \operatorname{rank}(A_s)$, where the $A_s$ are block matrices determined by the fitting criterion, and the minimization is over block-matrix assignments satisfying receiver demand and sender-knowledge incidence conditions (Khalesi et al., 15 Dec 2025).
- Clique-Cover Upper Bound: This generalizes the clique-cover approach via hypercliques, yielding an upper bound on the broadcast length equal to the minimum number of valid hypercliques covering all receivers.
- Complement Clique-Number Lower Bound: The broadcast length is lower bounded by the clique number of a carefully defined complement hypergraph $\overline{\mathcal{H}}$ (Khalesi et al., 15 Dec 2025).
- Cycle-Cover and Partitioned Chromatic Bounds: In the two-sender case, and more broadly for $K$-sender settings, the optimal rate is bounded above by variants of message-connected cycle-cover and (local-)chromatic cover numbers, where all combinatorial objects must respect the sender-constraint graph (Thapa et al., 2016).
- Pruning-Appending Lower Bound: In the uniprior multicast regime, iterated pruning and appending of arcs on strongly connected components (SCCs) in the side-information digraph, constrained by message-graph connectivity, produces a tight or near-tight lower bound. When message sets are disjoint among senders, the resulting bound is exact (Ong et al., 2013, Ong et al., 2014).
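For orientation, the sketch below brute-forces the classical single-sender minrank over GF(2) for the 3-cycle side-information digraph; hyper-minrank generalizes this quantity by imposing sender-availability block structure on the fitting matrix. The computation is exponential and purely illustrative, not an algorithm from the cited papers.

```python
# Brute-force single-sender minrank over GF(2) for a 3-cycle side-information
# digraph (receiver i demands x_i); illustrative, exponential-time only.
from itertools import product

SIDE_INFO = {0: {1}, 1: {2}, 2: {0}}
n = 3

def rank_gf2(rows):
    """Rank of a list of GF(2) row vectors via Gaussian elimination."""
    rows, rank = [r[:] for r in rows], 0
    for col in range(n):
        pivot = next((i for i in range(rank, n) if rows[i][col]), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(n):
            if i != rank and rows[i][col]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[rank])]
        rank += 1
    return rank

# Fitting matrices: ones on the diagonal, free entries only where the row's
# receiver has side information, zeros elsewhere.
free = [(i, j) for i in range(n) for j in range(n) if i != j and j in SIDE_INFO[i]]
best = n
for bits in product((0, 1), repeat=len(free)):
    A = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for (i, j), b in zip(free, bits):
        A[i][j] = b
    best = min(best, rank_gf2(A))

print("minrank over GF(2):", best)   # -> 2 for the 3-cycle
```

The minimizing fitting matrix here has rank 2, consistent with the known fact that a directed cycle on $n$ messages has minrank $n - 1$.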
4. Decomposition and Subproblem Structure
A general structural insight in MSIC is that, under certain interaction patterns between senders' message sets (notably, when the sender-message partitions induce acyclic meta-graphs), the joint MSIC problem decomposes into a sum of single-sender index coding subproblems. Denoting the possible non-empty intersections among sender message sets as $\mathcal{P}_j$ for $j = 1, \dots, J$, and their induced side-information subgraphs as $\mathcal{D}_j$, many cases satisfy $\ell^*(\mathcal{D}) = \sum_{j=1}^{J} \ell^*(\mathcal{D}_j)$, where $\ell^*(\mathcal{D}_j)$ is the optimal codelength for the subproblem on $\mathcal{D}_j$ with messages exclusive to $\mathcal{P}_j$. For two-sender or three-sender scenarios, detailed interaction digraph classifications yield max-sum formulas or direct-sum formulas depending on the strongly connected component structure and meta-graph acyclicity (Thapa et al., 2017, Arunachala et al., 2018, A. et al., 2018).
Further, the coding schemes—either in the weakly secure, groupcast, or multicast regime—can be constructed as block-diagonal or joint extensions of the codes for each subproblem, possibly employing interference alignment or codeword overlap when needed (Arunachala et al., 2019, A. et al., 2018).
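A minimal sketch of the block-diagonal idea follows, assuming single-sender encoding matrices for two disjoint subproblems are already in hand; the matrices and subproblems are hypothetical.

```python
# A minimal sketch (hypothetical instance) of the block-diagonal construction:
# when two senders hold disjoint message sets, stacking their single-sender
# codes yields a valid multi-sender code whose length is the sum of the parts.
def block_diag(G1, G2):
    """Stack two GF(2) encoding matrices into one block-diagonal matrix."""
    c1, c2 = len(G1[0]), len(G2[0])
    top = [row + [0] * c2 for row in G1]
    bottom = [[0] * c1 + row for row in G2]
    return top + bottom

# Subproblem 1 (sender S1, messages x1..x3 on a 3-cycle): send x1+x2, x2+x3.
G1 = [[1, 1, 0],
      [0, 1, 1]]
# Subproblem 2 (sender S2, messages x4, x5 with mutual side info): send x4+x5.
G2 = [[1, 1]]

G = block_diag(G1, G2)       # 3 coded symbols over the 5 messages
for row in G:
    print(row)
```

Because the message sets are disjoint, no receiver's decoding is affected by the other block, so the resulting sum codelength is simply the sum of the per-subproblem lengths.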
5. Coding Schemes and Capacity Regions
Several algorithmic and coding-theoretic frameworks have been developed for MSIC:
- Linear Code Construction: Linear codes are constructed via fitting matrices whose block-partition respects sender message availabilities and receiver side-information; the optimal codelength then matches the minimum rank over all feasible such matrices (Kim et al., 2019). For unicast and groupcast, these may be achieved by block-diagonal concatenations of optimal single-sender codes, or by joint-extensions, with rank analysis determining optimality (A. et al., 2018).
- Cooperative Composite Coding (CCC): This approach introduces cooperative binning of composite messages among senders (i.e., shared indices for messages present at multiple senders), with correlated source coding (Slepian-Wolf) to exploit sender-link capacities more efficiently than independent non-cooperative binning. CCC is shown to strictly enlarge the achievable rate region over partitioned distributed composite coding (DCC) and, with joint link-sender partitioning, to match the capacity region in previously unsolved cases (Li et al., 2017).
- Hyper-Minrank Algorithm: An exact exponential-time algorithm enumerates all possible "fittings" (receiver-to-sender edge assignments and supporting parity constraints), selecting the configuration minimizing the sum of individual sender codeword ranks. For certain sender-replication patterns, this approach outperforms heuristic rank minimization methods (Khalesi et al., 15 Dec 2025).
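The sketch below conveys the flavor of such exhaustive search on the toy instance of Section 1: it tries every split of coded symbols among senders and every sender-respecting GF(2) row, returning the smallest total length for which a receiver-decodable assignment exists. It is a simplified illustration with hypothetical identifiers, not the fitting enumeration of (Khalesi et al., 15 Dec 2025).

```python
# Exhaustive search for the shortest scalar linear multi-sender code on the
# toy instance; decodability uses the standard span condition. Illustrative only.
from itertools import combinations_with_replacement, product

N = 3
SENDERS = [{0, 1}, {1, 2}]                   # message indices held by each sender
RECEIVERS = [(0, {1}), (1, {2}), (2, {0})]   # (demanded message, side information)

def rank_gf2(rows):
    """Rank of a list of length-N GF(2) row vectors (Gaussian elimination)."""
    rows, r = [list(x) for x in rows], 0
    for c in range(N):
        p = next((i for i in range(r, len(rows)) if rows[i][c]), None)
        if p is None:
            continue
        rows[r], rows[p] = rows[p], rows[r]
        for i in range(len(rows)):
            if i != r and rows[i][c]:
                rows[i] = [a ^ b for a, b in zip(rows[i], rows[r])]
        r += 1
    return r

def unit(j):
    return [1 if k == j else 0 for k in range(N)]

def decodable(code_rows):
    """Receiver r decodes x_d iff e_d lies in the span of the coded rows
    together with the unit vectors of its side information."""
    for want, known in RECEIVERS:
        basis = list(code_rows) + [unit(j) for j in known]
        if rank_gf2(basis + [unit(want)]) != rank_gf2(basis):
            return False
    return True

def sender_rows(s):
    """All nonzero GF(2) rows supported only on sender s's message set."""
    return [list(bits) for bits in product((0, 1), repeat=N)
            if any(bits) and all(b == 0 for j, b in enumerate(bits)
                                 if j not in SENDERS[s])]

best = None
for total in range(1, N + 1):                # try shortest total length first
    for split in combinations_with_replacement(range(len(SENDERS)), total):
        for choice in product(*[sender_rows(s) for s in split]):
            if decodable(choice):
                best = (total, split, choice)
                break
        if best:
            break
    if best:
        break

print("optimal sum codelength:", best[0])    # -> 2 on this toy instance
```

Even on three messages the search space grows quickly, which reflects the exponential complexity discussed in Section 6.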
The broadcast rate region in MSIC is often non-convex and may require time-sharing between code structures (convex hull over achievable regions), especially under sender-link bandwidth asymmetry.
6. Structural and Algorithmic Complexity
The MSIC problem is computationally hard: chromatic number and clique-cover computations on union graphs (as required by clique-cover bounds) are NP-hard even for $k$-partite graphs (Thapa et al., 2016). Hyper-minrank and fitting-matrix minimization schemes have exponential complexity in the number of messages and senders, but offer exact solutions in small or structurally favorable instances (Khalesi et al., 15 Dec 2025).
When message sets among senders are strictly disjoint, several lower and upper bounds coincide, and simple (greedy) cyclic or tree-based coding suffices to achieve optimality. In the general case, the presence of overlap, semi-connected or degenerate SCCs, and partial side-information participation necessitates fine-grained graph-theoretic or hypergraph-theoretic analysis to determine coding gains (Ong et al., 2013, Thapa et al., 2016, Ong et al., 2014).
7. Applications and Extensions
MSIC serves as the combinatorial substrate for a broader class of distributed information systems:
- Coded Caching: The delivery phase of multi-server coded caching maps onto MSIC, with sender constraints induced by cache placement (Khalesi et al., 15 Dec 2025).
- Coded Distributed Computation/Storage: MSIC models retrieval or repair schemes with helper nodes holding fragments or stripes, where efficient coding reduces network or disk traffic.
- Cooperative Wireless Networks and Edge/Satellite Systems: Multi-gateway downlink or joint transmission scenarios correspond to MSIC with cross-gateway signal cancellation articulated via coupled hyperedges.
In all these domains, hyper-minrank constitutes a unified design and analytic criterion for scalar linear communication cost, while clique-cover and fitting-matrix analysis provide practical heuristics and bounding tools.
Key Citations:
- "Graph-Theoretic Approaches to Two-Sender Index Coding" (Thapa et al., 2016)
- "The Multi-Sender Multicast Index Coding" (Ong et al., 2013)
- "Hyper-Minrank: A Unified Hypergraph Characterization of Multi-Sender Index Coding" (Khalesi et al., 15 Dec 2025)
- "The Single-Uniprior Index-Coding Problem: The Single-Sender Case and The Multi-Sender Extension" (Ong et al., 2014)
- "Cooperative Multi-Sender Index Coding" (Li et al., 2017)
- "Structural Characteristics of Two-Sender Index Coding" (Thapa et al., 2017)
- "Linear Index Coding With Multiple Senders and Extension to a Cellular Network" (Kim et al., 2019)
- "Optimal Scalar Linear Codes for Some Classes of The Two-Sender Groupcast Index Coding Problem" (A. et al., 2018)
- "Optimal Linear Broadcast Rates of the Two-Sender Unicast Index Coding Problem with Fully-Participated Interactions" (Arunachala et al., 2018)
- "Optimal Weakly Secure Linear Codes for Some Classes of the Two-Sender Index Coding Problem" (Arunachala et al., 2019)