Abstract
In this paper we study the fundamental problems of maximizing a continuous non-monotone submodular function over the hypercube, both with and without coordinate-wise concavity. This family of optimization problems has several applications in machine learning, economics, and communication systems. Our main result is the first 1/2-approximation algorithm for continuous submodular function maximization; this approximation factor of 1/2 is the best possible for algorithms that use only polynomially many value queries. For the special case of DR-submodular maximization, i.e., when the submodular function is also concave along each coordinate, we provide a faster 1/2-approximation algorithm that runs in almost linear time. Both of these results improve upon prior work [Bian et al., 2017a,b, Soma and Yoshida, 2017, Buchbinder et al., 2012, 2015]. Our first algorithm is a single-pass algorithm that reduces the task of guaranteeing the approximation factor to analyzing a zero-sum game for each coordinate, and then exploits the geometry of this game to fix the value of that coordinate. Our second algorithm is a faster single-pass algorithm that uses coordinate-wise concavity to identify a monotone equilibrium condition sufficient for obtaining the required approximation guarantee, and then hunts for the equilibrium point via binary search. We further run experiments to verify the performance of our proposed algorithms in related machine learning applications.
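The "binary search for the equilibrium point" step mentioned above can be illustrated in isolation. The sketch below is a hedged, generic version: the function `g` stands in for a monotone equilibrium condition on a single coordinate (here, the derivative of a toy coordinate-wise concave function), not the paper's actual objective or condition.

```python
# Minimal sketch of hunting for a per-coordinate equilibrium via binary search.
# `g` is an assumed stand-in for a monotone (decreasing) equilibrium condition.

def binary_search_equilibrium(g, lo=0.0, hi=1.0, tol=1e-9):
    """Find z in [lo, hi] with g(z) ~ 0, assuming g is monotone decreasing."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if g(mid) > 0:
            lo = mid  # condition still positive: equilibrium lies to the right
        else:
            hi = mid  # condition non-positive: equilibrium lies to the left
    return (lo + hi) / 2.0

# Toy example: f(z) = z - z^2 is concave in z, so its derivative
# g(z) = 1 - 2z is monotone decreasing and vanishes at z = 1/2.
z = binary_search_equilibrium(lambda z: 1.0 - 2.0 * z)
print(round(z, 6))  # → 0.5
```

Because the condition is monotone, each coordinate takes only O(log(1/tol)) evaluations, which is what makes the almost-linear running time of the second algorithm plausible.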