
Subadditive and Succinct Valuations

Updated 13 December 2025
  • Subadditive and succinct valuations are valuation functions defined on item subsets with polynomial-size representations, essential for efficient welfare maximization in auctions.
  • The analysis reveals how structural properties and limited communication protocols achieve tight approximation ratios, especially when succinct bidders are present.
  • Methodological models, including number-in-hand communication and query-based learning, illustrate the impact of representation constraints on algorithmic mechanism design.

Subadditive and succinct valuations are central tools in combinatorial optimization, auction theory, and algorithmic mechanism design. The study of their structural properties, communication complexity, and learnability elucidates both the fundamental limitations and the achievable approximations in welfare maximization and preference learning.

1. Core Definitions and Valuation Hierarchy

Let $M = [m]$ denote a finite ground set of items, and $N = [n]$ a set of agents (bidders). A (monotone) valuation function $v : 2^{M} \to \mathbb{R}_{+}$ assigns a non-negative value to each set of items, satisfying monotonicity ($v(S) \leq v(T)$ for $S \subseteq T$) and $v(\emptyset) = 0$.

Subadditive (SA) valuations: $v(S \cup T) \leq v(S) + v(T)$ for all $S, T \subseteq M$.

XOS (Fractionally Subadditive) valuations: $v(S) = \max_{\ell=1}^{k} \sum_{j \in S} w_j^{\ell}$, where each $w^{\ell} \in \mathbb{R}_{+}^{m}$. Every XOS function is subadditive.

Additive valuations: $v(S) = \sum_{j \in S} w_j$ for some $w_j \geq 0$.

Single-minded (SM) valuations: there exist $T \subseteq M$ and $w \geq 0$ such that $v(S) = w$ if $S \supseteq T$ and $v(S) = 0$ otherwise.

Succinct (SC) valuations: any valuation that can be fully described using $\mathrm{poly}(m)$ bits, such as additive or single-minded valuations.

The standard expressiveness hierarchy is:

$\mathrm{OXS} \subsetneq \mathrm{GrossSubstitutes} \subsetneq \mathrm{Submodular} \subsetneq \mathrm{XOS} \subsetneq \mathrm{Subadditive}$

Every submodular function admits an (exponential-sized) XOS representation (Balcan et al., 2011).
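The classes above can be made concrete in a short sketch. The helper names and the toy three-item ground set are illustrative choices of mine, not constructions from the cited papers; the subadditivity check simply tests the defining inequality on all subset pairs.

```python
# Toy implementations of the valuation classes defined above (illustrative
# sketch; names and the ground set are my own, not from the cited papers).
from itertools import combinations

ITEMS = frozenset(range(3))  # toy ground set M = {0, 1, 2}

def additive(weights):
    """Additive valuation: v(S) = sum of fixed per-item weights."""
    return lambda S: sum(weights[j] for j in S)

def xos(clauses):
    """XOS valuation: v(S) = max over additive clauses evaluated on S."""
    return lambda S: max(sum(w[j] for j in S) for w in clauses)

def single_minded(T, w):
    """Single-minded valuation: worth w iff S contains the bundle T."""
    return lambda S: w if set(T) <= set(S) else 0.0

def is_subadditive(v, items=ITEMS):
    """Check v(S ∪ T) <= v(S) + v(T) over all pairs of subsets."""
    subs = [frozenset(c) for r in range(len(items) + 1)
            for c in combinations(sorted(items), r)]
    return all(v(S | T) <= v(S) + v(T) + 1e-9 for S in subs for T in subs)

v_xos = xos([(1.0, 0.0, 2.0), (0.0, 3.0, 0.0)])        # k = 2 additive clauses
assert is_subadditive(v_xos)                           # every XOS is subadditive
assert not is_subadditive(single_minded({0, 1}, 5.0))  # succinct, yet not SA
```

The last assertion illustrates why SA and SC are genuinely different classes: a single-minded valuation is succinct but violates subadditivity on any split of its desired bundle.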

2. Succinctness and Representation Complexity

Arbitrary subadditive or XOS valuations require representation size exponential in $m$ in the worst case. Succinct representations are those whose encoding (the number $k$ of additive clauses in an XOS representation, together with their bit-precision) is polynomial in $m$.

  • Succinct XOS: $k = \mathrm{poly}(m)$ clauses whose weights have $\mathrm{poly}(m)$ bits.
  • Succinct subadditive: a subset of subadditive valuations representable with polynomially bounded description length.

Succinctness permits tractability in both communication and learning, enabling polynomial-time algorithms otherwise infeasible for general (exponentially large) valuations (Qiu et al., 6 Dec 2025, Balcan et al., 2011).

3. Communication Complexity in Welfare Maximization

Consider multi-bidder combinatorial auctions, where each agent's valuation is private and may be from SA, XOS, or SC. The primary computational question is to approximate the optimal allocation $\max_{(A_1, \ldots, A_{n+c})} \sum_i v_i(A_i)$ within $\mathrm{poly}(m)$ communication.
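For intuition, the welfare objective can be evaluated exactly by brute force at toy scale. This hypothetical helper is exponential in $m$ and exists only to pin down what is being approximated; efficient protocols cannot enumerate allocations like this.

```python
# Brute-force welfare maximization over all allocations of a small item set.
# Purely illustrative (exponential time); the results below concern what is
# achievable with only poly(m) communication.
from itertools import product

def max_welfare(valuations, m):
    """Exhaustively assign each of the m items to one of the bidders."""
    n, best, best_alloc = len(valuations), 0.0, None
    for assign in product(range(n), repeat=m):   # item j goes to bidder assign[j]
        bundles = [frozenset(j for j in range(m) if assign[j] == i)
                   for i in range(n)]
        welfare = sum(v(S) for v, S in zip(valuations, bundles))
        if welfare > best:
            best, best_alloc = welfare, bundles
    return best, best_alloc

# Two bidders, three items: a unit-demand (hence XOS) bidder and an additive one.
v1 = lambda S: max([0.0] + [{0: 2.0, 1: 3.0}.get(j, 0.0) for j in S])  # unit demand
v2 = lambda S: sum({0: 1.0, 1: 1.0, 2: 4.0}[j] for j in S)            # additive
best, alloc = max_welfare([v1, v2], m=3)  # optimal welfare: item 1 to v1, rest to v2
```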

For SA $\cup$ SC (subadditive plus succinct bidders):

  • A polynomial-communication $(3 - 2/n)$-approximation is achievable. The protocol pools all SC valuations, lets each SA bidder optimize jointly with the SC pool, and selects the best of these outcomes or a $2$-approximation for the SA bidders alone (Qiu et al., 6 Dec 2025).
  • There is a matching $3$-hardness: for large $n$, any $(3 - \epsilon)$-approximation requires communication exponential in either the number of succinct bidders $c$ or in $\sqrt{m}/n^{3/2}$. Thus the approximation barrier rises from $2$ for plain SA to $3$ when even a single SC bidder is added.
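A schematic toy version of this pooling idea can be sketched as follows. This is my simplification, not the paper's protocol: the "$2$-approximation for SA alone" step is replaced by an exact SA-only optimum (affordable at this scale), all optimization is brute force, and the helper names are hypothetical.

```python
# Schematic toy version of the pooling idea for SA ∪ SC bidders: announce all
# succinct valuations (cheap, since they are poly(m) bits), treat them as one
# publicly computable "pool", let each SA bidder propose its best split of M
# against the pool, and keep the best candidate. Illustrative only; the real
# protocol and its (3 - 2/n) guarantee are in Qiu et al. (6 Dec 2025).
from itertools import combinations, product

def powerset(items):
    s = sorted(items)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def best_split(valuations, bundle):
    """Brute-force optimal welfare of the given bidders on `bundle`."""
    items, best = sorted(bundle), 0.0
    for assign in product(range(len(valuations)), repeat=len(items)):
        bundles = [frozenset(j for j, b in zip(items, assign) if b == i)
                   for i in range(len(valuations))]
        best = max(best, sum(v(B) for v, B in zip(valuations, bundles)))
    return best

def pooled_protocol(sa_vals, sc_vals, items):
    M = frozenset(items)
    pool = lambda B: best_split(sc_vals, B)  # SC valuations are public once pooled
    # Each SA bidder proposes its best split of M against the SC pool.
    joint = max(max(v(S) + pool(M - S) for S in powerset(M)) for v in sa_vals)
    # Stand-in for the paper's 2-approximation on SA bidders alone (here exact).
    return max(joint, best_split(sa_vals, M))

v_a = lambda S: 4.0 if S else 0.0            # subadditive: flat value when nonempty
v_b = lambda S: 2.0 * len(S)                 # additive (hence subadditive)
sm1 = lambda S: 5.0 if {0, 1} <= S else 0.0  # single-minded on {0, 1} -> succinct
sm2 = lambda S: 3.0 if 2 in S else 0.0       # single-minded on {2}    -> succinct
welfare = pooled_protocol([v_a, v_b], [sm1, sm2], {0, 1, 2})
```

On this instance the pooling candidate (bidder $v_a$ takes item $2$, the pool takes $\{0,1\}$) already matches the true optimum.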

For XOS $\cup$ SC:

  • A polynomial-communication $2$-approximation is achievable via the configuration LP and an Online Contention Resolution Scheme (OCRS).
  • A matching $2$-hardness barrier: no $(2 - \epsilon)$-approximation with polynomial communication as $n \to \infty$.

The separation persists for fixed $n$: for SA $\cup$ SM, achieving a $2.06$-approximation with communication polynomial in $m$ is already impossible for all $n \geq 3$. Similarly, for XOS $\cup$ SC, inapproximability exceeds the plain-XOS threshold as soon as even one succinct bidder is introduced (Qiu et al., 6 Dec 2025).

A summary of optimal approximation ratios is as follows:

Valuation class | Optimal poly-comm. ratio | Hardness barrier
SA | $2$ | $2 - o(1)$
SA $\cup$ SC | $3$ | $3 - o(1)$
XOS | $1/(1-(1-1/n)^n) \leq e/(e-1)$ | $1/(1-(1-1/n)^n) - o(1)$
XOS $\cup$ SC | $2$ | $2 - o(1)$

Key implication: Any addition of succinct bidders elevates the hardness and achievable approximation ratio in communication-constrained settings (Qiu et al., 6 Dec 2025).

4. Learnability and Sample Complexity

Subadditive and XOS valuations, due to their generality and non-linearity, exhibit intrinsic barriers in learning.

In the distributional (PMAC) model, for general subadditive or XOS:

  • With polynomially many samples or queries, the best achievable approximation factors are $\tilde{O}(\sqrt{n})$ for XOS and $O(\sqrt{n} \log n)$ for subadditive valuations.
  • No algorithm (even with value or price queries) can improve beyond $o(\sqrt{n}/\log n)$.
  • The heart of these results is a structural lemma: every XOS $f$ is sandwiched as $\sqrt{w(S)} \leq f(S) \leq \sqrt{n}\,\sqrt{w(S)}$ for some additive $w$; for subadditive $f$ an extra logarithmic factor applies (Balcan et al., 2011).

For succinct (poly-size) XOS:

  • For XOS with representation size $k = \mathrm{poly}(n)$ and any $\eta > 0$, PMAC-learning to factor $n^{\eta}$ is possible in time $n^{O(1/\eta)}$.
  • For OXS/XOS with $R$ leaves (or trees), PMAC-learnability to factor $R$ is achievable; in fact, if $R = O(1)$, exact PAC-learning is possible (Balcan et al., 2011).

Thus, learnability barriers from expressiveness can be circumvented by structural constraints:

  • General SA/XOS: a $\tilde{\Theta}(\sqrt{n})$ barrier.
  • Succinct SA/XOS: arbitrarily small polynomial approximation factors, depending on representation size.
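As a minimal illustration of why succinctness helps, an additive valuation (the simplest succinct class) is recoverable exactly from $m$ singleton value queries. This sketch is mine, not the paper's PMAC algorithm, but it shows how a polynomial-size representation collapses the learning problem.

```python
# Value-query learning of an additive valuation: m singleton queries recover
# it exactly, in contrast with the sqrt(n)-type barriers for general XOS and
# subadditive classes. Illustrative sketch, not the paper's PMAC algorithm.
def learn_additive(value_query, m):
    """Recover an additive valuation exactly from m singleton value queries."""
    weights = [value_query(frozenset({j})) for j in range(m)]
    return lambda S: sum(weights[j] for j in S)

true_w = [2.0, 0.5, 7.0]                      # hidden additive valuation
oracle = lambda S: sum(true_w[j] for j in S)  # value-query access only
learned = learn_additive(oracle, m=3)
assert learned(frozenset({0, 2})) == oracle(frozenset({0, 2}))
```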

5. Structural and Algorithmic Insights

The transition from exponential to succinct representations fundamentally alters both communication and computational complexity. Several structural results facilitate improved algorithms:

  • Every XOS valuation $v$ admits an additive approximation $w$ such that $\sqrt{w(S)} \leq v(S) \leq \sqrt{n}\,\sqrt{w(S)}$.
  • Any SA valuation can be $\ln n$-approximated by an XOS function.
  • For XOS with at most $R$ additive clauses, raising clause sums to the $L$-th power and averaging allows polynomial-time learning in a high-dimensional feature space.

These lemmas connect combinatorial valuations to linear threshold functions and polynomials, forming the backbone of both learning theory and approximation protocols.
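The first structural lemma can be checked numerically on the smallest interesting case: unit demand on $n = 2$ items. The additive witness $w = (1/2, 1/2)$ is chosen by hand for this one instance; the general construction of $w$ is in Balcan et al. (2011).

```python
# Numeric check of the XOS sandwich sqrt(w(S)) <= v(S) <= sqrt(n)*sqrt(w(S))
# on one tiny instance: unit demand on n = 2 items (an XOS function, being
# the max of the additive clauses (1,0) and (0,1)), with the hand-picked
# additive witness w = (1/2, 1/2). Not a general construction.
import math

n = 2
v = lambda S: max([0.0] + [1.0 for j in S])  # unit demand: 1 iff S nonempty
w = {0: 0.5, 1: 0.5}                         # additive witness for this instance
for S in [set(), {0}, {1}, {0, 1}]:
    ws = sum(w[j] for j in S)
    assert math.sqrt(ws) <= v(S) + 1e-12                  # lower sandwich bound
    assert v(S) <= math.sqrt(n) * math.sqrt(ws) + 1e-12   # upper sandwich bound
```

The upper bound is tight here at the singletons ($v(\{0\}) = 1 = \sqrt{2} \cdot \sqrt{1/2}$), which is why no additive witness can do better than the $\sqrt{n}$ gap on unit-demand functions.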

6. Methodological Models: Communication and Query Types

Two interaction models dominate the study of these classes:

  • Number-in-hand communication: each agent knows only its private valuation and communicates via a shared blackboard. Polynomial-communication protocols are required for tractability (Qiu et al., 6 Dec 2025).
  • Query-based learning: value queries ($v(S)$ for any $S$) and price queries (a binary response to a posted price for $S$). PMAC-learning bounds transfer between these models with logarithmic overhead in sample complexity (Balcan et al., 2011).
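The relation between the two query types can be sketched directly: a value query can be emulated by binary search over price queries, one source of the logarithmic overhead. This sketch assumes values lie in a known bounded range $[0, \mathrm{hi})$; the helper names are hypothetical.

```python
# Emulating a value query with price queries via binary search: the bidder
# "buys" S at posted price p iff v(S) >= p, so logarithmically many price
# queries pin down v(S) to any precision. Sketch under the assumption that
# values lie in a known range [0, hi).
def price_query(v, S, p):
    """Binary response to a posted price: does the bidder buy S at price p?"""
    return v(S) >= p

def value_via_prices(v, S, hi=16.0, tol=1e-6):
    """Binary search over posted prices, assuming 0 <= v(S) < hi."""
    lo = 0.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if price_query(v, S, mid) else (lo, mid)
    return lo

v = lambda S: 2.0 * len(S)    # toy additive valuation
approx = value_via_prices(v, frozenset({0, 1}))
assert abs(approx - 4.0) < 1e-4
```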

The feasibility of information exchange directly determines achievable welfare or learning accuracy.

7. Open Problems and Structural Directions

Key unresolved directions include:

  1. Closing poly-logarithmic gaps in learnability for the subadditive class.
  2. Determining sharper bounds for submodular learning (the current best bounds lie between $n^{1/3}$ and $\sqrt{n}$).
  3. Identifying further valuation subclasses (e.g., budget-additive, coverage) with superior structural approximability.

The interplay between succinctness, approximation, and communication remains a focal point for advances in algorithmic mechanism design and learning theory (Qiu et al., 6 Dec 2025, Balcan et al., 2011).
