Geometric Rescaling Algorithms for Submodular Function Minimization (1707.05065v4)
Abstract: We present a new class of polynomial-time algorithms for submodular function minimization (SFM), as well as a unified framework to obtain strongly polynomial SFM algorithms. Our algorithms are based on simple iterative methods for the minimum-norm problem, such as the conditional gradient and Fujishige-Wolfe algorithms. We exhibit two techniques to turn simple iterative methods into polynomial-time algorithms. Firstly, we adapt the geometric rescaling technique, which has recently gained attention in linear programming, to SFM and obtain a weakly polynomial bound $O((n^4\cdot \textrm{EO} + n^5)\log (nL))$. Secondly, we exhibit a general combinatorial black-box approach to turn $\varepsilon L$-approximate SFM oracles into strongly polynomial exact SFM algorithms. This framework can be applied to a wide range of combinatorial and continuous algorithms, including pseudo-polynomial ones. In particular, we can obtain strongly polynomial algorithms by a repeated application of the conditional gradient or of the Fujishige-Wolfe algorithm. Combined with the geometric rescaling technique, the black-box approach provides an $O((n^5\cdot \textrm{EO} + n^6)\log^2 n)$ algorithm. Finally, we show that one of the techniques we develop in the paper can also be combined with the cutting-plane method of Lee, Sidford, and Wong \cite{LSW}, yielding a simplified variant of their $O(n^3 \log^2 n \cdot \textrm{EO} + n^4\log^{O(1)} n)$ algorithm.
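For orientation, the sketch below illustrates the basic building blocks the abstract refers to: Edmonds' greedy linear-optimization oracle over the base polytope $B(f)$ (each call costs $n+1$ evaluations, i.e., EO calls), the plain conditional gradient (Frank-Wolfe) method for the minimum-norm point of $B(f)$, and the rounding of that point to a minimizer of $f$ via Fujishige's theorem. This is a minimal, unrescaled Python prototype under stated assumptions, not the paper's polynomial-time algorithms (which add geometric rescaling or the black-box restarting framework); the function names and the toy cut-minus-modular instance are illustrative choices, not taken from the paper.

```python
import numpy as np

def greedy_vertex(f, n, w):
    """Edmonds' greedy algorithm: return the vertex of the base polytope B(f)
    minimizing <w, q>. Elements are taken in increasing order of w and q gets
    the marginal values of f along that order; costs n+1 evaluations of f (EO)."""
    order = np.argsort(w)
    q = np.zeros(n)
    S, prev = [], f([])
    for i in order:
        S.append(int(i))
        cur = f(S)
        q[i] = cur - prev
        prev = cur
    return q

def min_norm_point_fw(f, n, iters=5000, tol=1e-12):
    """Plain conditional gradient (Frank-Wolfe) for min ||x||^2 over B(f):
    the simple iterative method the abstract starts from, without the paper's
    geometric rescaling or black-box restarts that make it polynomial."""
    x = greedy_vertex(f, n, np.zeros(n))      # arbitrary vertex of B(f)
    for _ in range(iters):
        q = greedy_vertex(f, n, x)            # linear minimization oracle in direction x
        d = q - x
        gap = -x.dot(d)                       # Frank-Wolfe duality gap
        if gap <= tol:
            break
        x = x + min(1.0, gap / d.dot(d)) * d  # exact line search on ||x + gamma*d||^2
    return x

def round_to_set(f, n, x):
    """Round an (approximate) minimum-norm point to a set: among the prefixes
    of the increasing order of x (the level sets {i : x_i <= t}), return one of
    smallest f-value. With the exact minimum-norm point this yields a minimizer
    of f by Fujishige's theorem."""
    order = np.argsort(x)
    best_S, best_val = [], f([])
    S = []
    for i in order:
        S.append(int(i))
        v = f(S)
        if v < best_val:
            best_S, best_val = list(S), v
    return best_S, best_val

if __name__ == "__main__":
    # Toy instance (not from the paper): a directed cut function minus a modular
    # term, normalized with f(empty) = 0 and submodular; its unique minimizer
    # is the full ground set {0, 1, 2, 3}, with value -2.
    edges = [(0, 1), (1, 2), (2, 0), (1, 3)]
    def f(S):
        S = set(S)
        cut = sum(1 for (u, v) in edges if u in S and v not in S)
        return cut - len(S & {1, 2})
    x = min_norm_point_fw(f, 4)
    S, val = round_to_set(f, 4, x)
    print(sorted(S), val)                     # expected: [0, 1, 2, 3] -2
```

The unrescaled method above only guarantees an approximate minimum-norm point in pseudo-polynomial time; the paper's contribution, as stated in the abstract, is to rescale the underlying geometry or to restart such $\varepsilon L$-approximate solvers inside a combinatorial black-box framework so that exact SFM is obtained in (strongly) polynomial time.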