
Column-and-Constraint Generation (CCG)

Updated 20 October 2025
  • Column-and-Constraint Generation (CCG) is a hybrid decomposition method combining mathematical programming and constraint programming to solve large-scale discrete optimization problems.
  • It iteratively adds decision variables and constraints based on subproblem solutions, ensuring feasibility and finite convergence through adaptive master problem updates.
  • CCG is applied in various fields such as software testing, robust optimization, machine learning, and logistics, with recent advances integrating neural and reinforcement learning techniques.

The Column-and-Constraint Generation (CCG) algorithm is a hybrid decomposition technique for solving large-scale discrete optimization problems, especially two-stage adaptive and stochastic optimization, robust combinatorial structures, and set covering variants. CCG iteratively and dynamically generates both decision variables (“columns”) and constraints, often using mathematical programming (MP) for global optimization structure and constraint programming (CP) for logical and combinatorial subproblems. Its variants provide guarantees for solution feasibility, finite convergence, and strong computational efficiency across diverse applications, including combinatorial software testing, adaptive optimization, machine learning, signal recovery, clustering, logistics, network scheduling, bin packing, and power system operations.

1. Mathematical Programming Master Problem Formulation

At the core of CCG is the decomposition of the original problem into a master problem and one or more subproblems. The master problem is typically formulated as a set covering integer program or, for continuous recourse, as a two-stage adaptive linear optimization, e.g. minimize \sum_{t\in T} c_t x_t subject to \sum_{t\in T} a_p^t x_t \ge 1 for all p \in P, with x_t\in\{0,1\} and c_t=1. In two-stage robust or stochastic optimization, the master problem involves a relaxation over the current set of scenarios, seeking the optimal first-stage decision x under cuts or columns added from previously identified subproblem solutions (Bertsimas et al., 2018, Tsang et al., 2022, Zhang et al., 30 May 2025, Shao et al., 14 Aug 2025):

\min_{x\in X,~\delta}~ c^T x + \delta

subject to

\delta \ge Q(x,\xi)~\forall~\xi\in S

and

x \in \text{domain constraints}

where Q(x, ξ) is the optimal value of the second-stage problem for scenario ξ. Enumerating all variables and constraints is intractable at scale, so CCG restricts the master problem to a manageable subset and expands it dynamically, guided by subproblem solutions.
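
As a concrete illustration, the following is a minimal sketch of the restricted master problem for a toy single-item two-stage problem, assuming a recourse cost Q(x, ξ) = d·max(ξ - x, 0) and using SciPy's linprog as the LP solver; the function name solve_master and all numbers are illustrative assumptions, not from the cited papers. Each admitted scenario contributes one recourse variable (a "column") and two linking constraints:

```python
# Restricted master problem for a toy two-stage problem:
#   min  c*x + delta
#   s.t. delta >= d * y_i          for each admitted scenario xi_i  (epigraph cuts)
#        y_i   >= xi_i - x         for each admitted scenario xi_i  (recourse columns)
#        x, delta, y_i >= 0
# Every scenario returned by the subproblem adds one column (y_i) and two rows.
import numpy as np
from scipy.optimize import linprog

def solve_master(scenarios, c=1.0, d=3.0):
    k = len(scenarios)
    n = 2 + k                                  # variables: [x, delta, y_1, ..., y_k]
    obj = np.zeros(n); obj[0] = c; obj[1] = 1.0
    A, b = [], []
    for i, xi in enumerate(scenarios):
        row = np.zeros(n); row[1] = -1.0; row[2 + i] = d     # d*y_i - delta <= 0
        A.append(row); b.append(0.0)
        row = np.zeros(n); row[0] = -1.0; row[2 + i] = -1.0  # -x - y_i <= -xi_i
        A.append(row); b.append(-xi)
    res = linprog(obj, A_ub=np.array(A), b_ub=np.array(b),
                  bounds=[(0, None)] * n, method="highs")
    x, delta = res.x[0], res.x[1]
    return x, delta, res.fun                   # res.fun is a lower bound on the true optimum

# Example: master restricted to two scenarios already generated by the subproblem.
print(solve_master(scenarios=[2.0, 5.0]))      # roughly x = 5, delta = 0, bound = 5
```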

2. Constraint Programming and Pricing Subproblem

After solving the master problem, CCG invokes a pricing or scenario-generation subproblem. In combinatorial testing, the pricing subproblem is formulated as a CP model that leverages dual variables (coverage “prices”) to construct candidate test configurations (columns) with negative reduced cost (Kadioglu, 2017):

r(t) = 1 - \sum_{p\in P} a_p^t \bar\pi_p

New columns are sought by maximizing the dual-weighted coverage (pattern variables subject to logical constraints):

\max\limits_{\text{test configurations}}~\sum_{c\in C}\sum_{i\in p_c} \bar\pi_p~\text{pattern}_{c,i}

subject to Boolean and domain constraints.
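
As a toy illustration of this pricing step, the sketch below scores candidate test configurations by their reduced cost r(t) = 1 - Σ_p a_p^t π̄_p and returns the most negative one. It is a sketch only: the parameter domains are hypothetical and brute-force enumeration stands in for the CP search used in practice.

```python
# Toy pricing subproblem for combinatorial-testing CCG: enumerate candidate
# configurations, score each by r(t) = 1 - sum_p a_p^t * pi_p over the 2-wise
# interactions p it covers, and return the best improving column (if any).
from itertools import combinations, product

DOMAINS = {"os": ["linux", "windows"], "db": ["pg", "mysql"], "browser": ["ff", "chrome"]}

def pairs_covered(config):
    """All 2-wise interactions (pairs of parameter-value assignments) covered by config."""
    return {frozenset(pair) for pair in combinations(sorted(config.items()), 2)}

def price_column(duals):
    """duals: dict mapping a 2-wise interaction to its dual price pi_p from the master LP."""
    best_cfg, best_rc = None, 0.0
    for values in product(*DOMAINS.values()):
        cfg = dict(zip(DOMAINS.keys(), values))
        rc = 1.0 - sum(duals.get(p, 0.0) for p in pairs_covered(cfg))
        if rc < best_rc:                      # negative reduced cost -> improving column
            best_cfg, best_rc = cfg, rc
    return best_cfg, best_rc                  # (None, 0.0) means no improving column exists

# Example with made-up dual prices for two interactions.
duals = {frozenset({("os", "linux"), ("db", "pg")}): 0.6,
         frozenset({("db", "pg"), ("browser", "ff")}): 0.5}
print(price_column(duals))
```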

For robust optimization, the adversarial subproblem identifies a worst-case scenario (“column” or “cut”) by solving:

\max_{\xi\in U}~\underline{Z}(x,\{\xi\})

where \underline{Z} is the inner dual or minimization cost for scenario ξ. The column or scenario with maximal violation or most negative reduced cost is added to the master problem (Bertsimas et al., 2018, Tsang et al., 2022, Zhang et al., 30 May 2025).
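
Continuing the toy example from Section 1 (a sketch, assuming an interval uncertainty set so that the worst case is attained at a vertex), the adversarial step reduces to evaluating the recourse cost at the candidate vertices:

```python
# Adversarial subproblem for the toy two-stage problem: given the current
# first-stage decision x, pick the scenario xi in U maximizing the recourse
# cost Q(x, xi) = d * max(xi - x, 0). For a polyhedral (here: interval)
# uncertainty set it suffices to check the vertices.
def worst_case_scenario(x, vertices, d=3.0):
    def Q(xi):                                # exact second-stage cost for fixed xi
        return d * max(xi - x, 0.0)
    xi_star = max(vertices, key=Q)
    return xi_star, Q(xi_star)                # scenario to add and its recourse cost

# Example: U = [1, 6]; with x = 5 the worst case is xi = 6, with Q = 3.
print(worst_case_scenario(x=5.0, vertices=[1.0, 6.0]))
```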

For signal recovery or machine learning LPs, the pricing subproblem identifies variables (columns) and constraints violating optimality conditions (e.g. those with large dual residuals), often using KKT-based logic or dynamic programming methods (Mazumder et al., 2019, Dedieu et al., 2019). In set partitioning and clustering, dynamic constraint aggregation may group similar violated constraints to improve tractability (Sudoso et al., 8 Oct 2024).
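
For example, a minimal constraint-generation sketch for the Dantzig selector (toy data, SciPy's linprog, and the standard split β = β+ - β-; a generic cutting-plane illustration, not the specialized algorithms of the cited papers, and the function name dantzig_cg is an assumption) alternates between solving the LP over a working set of coordinate constraints and adding the coordinates whose bound |x_j^T(y - Xβ)| ≤ λ is violated:

```python
# Constraint generation for the Dantzig selector
#   min ||beta||_1   s.t.  ||X^T (y - X beta)||_inf <= lam
# Only a working set W of coordinate constraints |x_j^T (y - X beta)| <= lam is
# enforced; violated coordinates are added until none remain. Toy data only.
import numpy as np
from scipy.optimize import linprog

def dantzig_cg(X, y, lam, tol=1e-8, max_iter=50):
    n, p = X.shape
    G, g = X.T @ X, X.T @ y                  # constraint data: |G beta - g| <= lam
    W = {int(np.argmax(np.abs(g)))}          # start from the most violated row at beta = 0
    c = np.ones(2 * p)                       # variables: [beta_plus, beta_minus] >= 0
    beta = np.zeros(p)
    for _ in range(max_iter):
        rows = sorted(W)
        Gw = G[rows, :]
        A_ub = np.vstack([np.hstack([ Gw, -Gw]),     #  G beta <= lam + g
                          np.hstack([-Gw,  Gw])])    # -G beta <= lam - g
        b_ub = np.concatenate([lam + g[rows], lam - g[rows]])
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
        beta = res.x[:p] - res.x[p:]
        resid = np.abs(G @ beta - g) - lam           # dual-residual style violation check
        violated = np.where(resid > tol)[0]
        if violated.size == 0:
            return beta                              # feasible for all constraints -> optimal
        W.update(int(j) for j in violated)
    return beta

# Tiny synthetic example.
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 10))
beta_true = np.zeros(10); beta_true[:2] = [3.0, -2.0]
y = X @ beta_true + 0.1 * rng.standard_normal(30)
print(np.round(dantzig_cg(X, y, lam=1.0), 3))
```

Because the working-set LP is a relaxation of the full LP, its optimum is a valid lower bound, and once no constraint is violated the restricted solution is optimal for the full problem.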

3. Hybrid Decomposition and Algorithmic Innovations

CCG exploits hybrid decomposition:

  • Mathematical Programming (MP): Handles global optimization, dual information, and set covering or two-stage recourse structure.
  • Constraint Programming (CP): Efficient generation/filtering of combinatorial patterns, test configurations, or feasible routes, using logical and Boolean propagation (Kadioglu, 2017, Daryalal et al., 2021).
  • Duality-driven Benders or Dual Feasibility Oracles: Ensure feasibility of first-stage decisions in robust optimization by separating fast, approximate scenario generation from exact feasibility certification (Bertsimas et al., 2018).
  • Family Restricted Master Problems (FRMP): Stabilize dual variables and accelerate convergence by augmenting the master problem with “families” of related columns (Haghani et al., 2021).
  • Dynamic Constraint Aggregation (DCA): Reduces degeneracy by clustering constraints, managing constraint explosion in large-scale set partitioning or clustering (Sudoso et al., 8 Oct 2024).
  • Data-driven and Learning-accelerated Variants: Historical feasibility data, pointer networks, and neural approximators replace or accelerate iterative pricing (Duan et al., 2022, Shao et al., 14 Aug 2025, Chi et al., 2022).

4. Feasibility, Scalability, and Convergence Guarantees

Feasibility and scalability are key strengths of CCG:

  • Duality Driven Benders Decomposition (DDBD) extends CCG by integrating two oracles (a fast approximate search and a slower exact feasible-scenario search) to guarantee that returned first-stage solutions are feasible with respect to all second-stage recourses, even in the absence of full recourse (Bertsimas et al., 2018).
  • Only relevant columns and constraints are generated as needed, so complexity grows adaptively. Finite convergence is proven for polyhedral uncertainty sets (the worst-case scenario is always a vertex, so only finitely many iterations are possible) (Bertsimas et al., 2018, Tsang et al., 2022).
  • Inexact CCG variants relax master problem optimality at each iteration, leveraging backtracking and adaptive gap tightening to ensure finite convergence and maintain valid lower bounds on the optimum (Tsang et al., 2022).
  • Family-based and aggregation-based master problem variants reduce oscillation and dramatically lower the number of iterations required for convergence (Haghani et al., 2021, Sudoso et al., 8 Oct 2024).

5. Practical Implementations and Real-world Applications

CCG algorithms have demonstrated efficacy across a range of large-scale, real-world domains:

  • Combinatorial Software Testing: Used as a cloud service to generate JUnit-ready parameterized tests with guaranteed t-wise interaction coverage, scaling to heterogeneous domains and arbitrary coverage strengths. Cloud dashboards help visualize redundant coverage and diminishing marginal returns (Kadioglu, 2017).
  • Adaptive and Stochastic Optimization: Applied to facility location, energy unit commitment, distribution network reconfiguration with renewable generator resizing under decision-dependent uncertainty (DDU), and power systems. Mapping-based CCG accommodates DDU through explicit KKT-based scenario mapping (Zhang et al., 30 May 2025).
  • Machine Learning and Sparse Signal Recovery: Efficiently solves high-dimensional SVMs, Dantzig selector, Basis Pursuit, and Slope-SVM problems via hybrid column/constraint generation and warm-started Lasso initialization (Dedieu et al., 2019, Mazumder et al., 2019).
  • Clustering and Set Partitioning: DCA-accelerated CCG for minimum sum-of-squares clustering achieves computational advantage by reducing explicit constraint size and managing degeneracy (Sudoso et al., 8 Oct 2024).
  • Logistics, Bin Packing, Routing: Data-driven CCG leverages historical packing records, learning to price columns using pointer networks, thus improving packing success rate and computation time in manufacturing and logistics (Duan et al., 2022).
  • Network Migration and Scheduling: LBBD approaches combine column generation (Dantzig–Wolfe reformulation), CP scheduling, and Benders cuts to solve telecommunications migration and vehicle routing with synchronization constraints (Daryalal et al., 2021).
  • Capacity Sharing Networks: Exact column generation with a single-constrained shortest path (SCSP) reformulation attains optimal load balancing (MCF problem) with competitive computation times on NP-hard network instances by integrating dual-based algorithms (Hu et al., 1 Nov 2024).

6. Extensions: Machine Learning and Reinforcement Learning-assisted CCG

Recent developments integrate machine learning and RL into CCG frameworks for further acceleration and policy improvement:

  • Neural CCG replaces repeated subproblem solves with neural network estimators trained on scenario-feature mappings, achieving up to 130× speedup while maintaining optimality gaps below 0.096% for two-stage stochastic unit commitment (Shao et al., 14 Aug 2025).
  • Deep RL-guided CG (RLCG) frames column selection as a sequential decision process. A graph neural network encodes the RMP’s variable-constraint bipartite structure, and a DQN agent is trained to select columns that reduce total iterations by 22–40% compared to greedy policies on benchmark CSP and VRPTW instances (Chi et al., 2022).
  • Learning to price with pointer networks, as in bin packing, directly selects feasible columns (historical packing records) to accelerate convergence (Duan et al., 2022).
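
As a deliberately simplified illustration of the surrogate idea (a least-squares stand-in for the neural and pointer-network models of the cited papers; the toy cost function, features, and names are assumptions), the sketch below learns a cheap approximation of Q(x, ξ) and uses it to shortlist candidate scenarios before any exact subproblem solve:

```python
# Minimal sketch of a learned pricing surrogate: approximate the second-stage
# cost Q(x, xi) with a linear least-squares model fitted on sampled scenarios,
# then use the cheap surrogate to shortlist candidates and solve the exact
# subproblem only on the shortlist. Purely illustrative; the cited papers use
# neural networks and pointer networks instead of this toy regressor.
import numpy as np

def exact_Q(x, xi, d=3.0):                     # "expensive" exact subproblem (toy)
    return d * max(xi - x, 0.0)

def fit_surrogate(x, training_scenarios):
    # Features [1, xi, xi^2]; targets Q(x, xi). Ordinary least squares fit.
    A = np.array([[1.0, xi, xi * xi] for xi in training_scenarios])
    t = np.array([exact_Q(x, xi) for xi in training_scenarios])
    coef, *_ = np.linalg.lstsq(A, t, rcond=None)
    return lambda xi: coef @ np.array([1.0, xi, xi * xi])

def surrogate_pricing(x, candidates, shortlist_size=2):
    model = fit_surrogate(x, training_scenarios=np.linspace(0.0, 8.0, 20))
    shortlist = sorted(candidates, key=model, reverse=True)[:shortlist_size]
    best = max(shortlist, key=lambda xi: exact_Q(x, xi))   # exact solve on shortlist only
    return best, exact_Q(x, best)

print(surrogate_pricing(x=4.0, candidates=[1.0, 3.5, 5.0, 7.5]))
```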

7. Algorithmic Summary and Frequently Used Formulations

Core steps of CCG (generalized form):

  1. Restricted Master Problem (MP):
    • Solve for first-stage variables and covering variables over the current subset of columns/scenarios.
  2. Dual Extraction:
    • Obtain dual prices for constraints from the LP relaxation.
  3. Pricing/Subproblem:
    • Generate new columns or scenarios with maximal violation or negative reduced cost using CP model, dual-based search, pointer networks, or neural estimators.
  4. Feasibility Oracle (if required):
    • For robust optimization, check whether the first-stage decision is feasible for all scenarios using DDBD or backtracking routines.
  5. Column and Constraint Additions:
    • Augment the master problem, updating columns and constraints iteratively.
  6. Convergence Check:
    • Terminate if no violating columns/scenarios remain or optimality gap falls below tolerance.
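
A schematic driver mirroring these steps is sketched below; the callback names solve_master and solve_pricing and the tuple conventions are placeholders rather than an established API, and the relative-gap test matches the termination criterion (U - L)/U < ε given with the formulas further down.

```python
# Schematic CCG driver mirroring steps 1-6 above. The callbacks are placeholders:
#   solve_master(S)            -> (solution, lower_bound, duals)
#   solve_pricing(sol, duals)  -> (new_item, upper_bound), with new_item = None
#                                 when no violating column/scenario remains (and,
#                                 in robust settings, once the step-4 feasibility
#                                 oracle accepts the solution).
def ccg(solve_master, solve_pricing, eps=1e-4, max_iter=100):
    S = []                                        # columns / scenarios admitted so far
    best_ub = float("inf")
    sol = lb = None
    for _ in range(max_iter):
        sol, lb, duals = solve_master(S)          # steps 1-2: restricted master + duals
        new_item, ub = solve_pricing(sol, duals)  # steps 3-4: pricing / scenario search
        best_ub = min(best_ub, ub)
        gap = (best_ub - lb) / max(abs(best_ub), 1e-12)
        if new_item is None or gap < eps:         # step 6: termination test (U - L)/U < eps
            break
        S.append(new_item)                        # step 5: grow master with new column(s)/cut(s)
    return sol, lb, best_ub
```

Wiring in the toy master and worst-case subproblem sketched in Sections 1 and 2 (with solve_pricing returning None once no scenario raises the recourse cost above the current δ by more than a tolerance, and its upper bound taken as the first-stage cost plus the worst-case recourse cost) recovers the classical alternating scheme.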

Key formulas:

  • Column generation reduced cost in dual variable terms:

r(t) = 1 - \sum_{p \in P} a_p^t \bar{\pi}_p

  • Dantzig Selector LP form:

\min \; ||\beta||_1 \quad \text{s.t.} \quad ||X^T(y - X\beta)||_\infty \leq \lambda

  • Robust master with inexact gap:

\min_{x,\delta}~c^T x + \delta \quad \text{s.t.}~\delta \ge Q(x,\xi),~\forall \xi\in S

Termination gap:

\frac{U-L^\ell}{U} < \epsilon

  • Mapping-based CCG for decision-dependent uncertainty:

\mathcal{K}(\boldsymbol{\xi}, \lambda_t^*) = \{ (\boldsymbol{w}, \boldsymbol{\pi}, \boldsymbol{\theta}) : G^\top \lambda_t^* + F^\top \boldsymbol{\pi} + \boldsymbol{\theta} = 0,\, F\boldsymbol{w} \le f,\, \boldsymbol{w} \le \boldsymbol{\xi},\, \boldsymbol{\pi} \ge 0,\, \boldsymbol{\theta} \ge 0 \}

Objective Summary

Column-and-Constraint Generation is a versatile, scalable decomposition strategy enabling efficient solution of high-dimensional, combinatorial, and stochastic optimization problems. By integrating mathematical programming, constraint programming, family aggregation, feasibility oracle strategies, and recent advances in machine learning, CCG addresses computational tractability while maintaining solution optimality and feasibility guarantees across software engineering, robust optimization, machine learning, and operations research.
