Abstract
This paper studies first order methods for solving smooth minimax optimization problems min_x max_y g(x, y), where g(·, ·) is smooth and g(x, ·) is concave for each x. In terms of g(·, y), we consider two settings – strongly convex and nonconvex – and improve upon the best known rates in both. For strongly-convex g(·, y) for all y,
we propose a new direct optimal algorithm combining Mirror-Prox and Nesterov's AGD, and show that it can find the global optimum in Õ(1/k^2) iterations, improving over the current state-of-the-art rate of O(1/k). We use this result along with an inexact proximal point method to provide an Õ(1/k^{1/3}) rate for finding stationary points in the nonconvex setting, where g(·, y) can be nonconvex. This improves over the current best-known rate of O(1/k^{1/5}).
Finally, we instantiate our result for finite nonconvex minimax problems, i.e., min_x max_{1≤i≤m} f_i(x) with nonconvex f_i(·), to obtain a convergence rate of O(m (log m)^{3/2} / k^{1/3}).
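For intuition about the minimax setting above, the following is a minimal sketch of the classic Mirror-Prox (extragradient) update with Euclidean prox on a hypothetical toy problem g(x, y) = 0.5x² + xy − 0.5y², which is strongly convex in x and strongly concave in y; this illustrates only the basic extragradient building block, not the paper's combined algorithm.

```python
# Toy saddle problem (an illustrative choice, not from the paper):
#   g(x, y) = 0.5*x**2 + x*y - 0.5*y**2
# Unique saddle point: (x*, y*) = (0, 0).

def grad(x, y):
    # Partial derivatives dg/dx and dg/dy of the toy g above.
    return x + y, x - y

def extragradient(x, y, eta=0.2, iters=200):
    """Euclidean Mirror-Prox: an extrapolation (half) step followed by
    an update step using the gradient at the extrapolated point."""
    for _ in range(iters):
        gx, gy = grad(x, y)
        xh, yh = x - eta * gx, y + eta * gy        # extrapolation step
        gxh, gyh = grad(xh, yh)
        x, y = x - eta * gxh, y + eta * gyh        # update step
    return x, y

x_star, y_star = extragradient(3.0, -2.0)
# converges toward the saddle point (0, 0)
```

A plain simultaneous gradient descent-ascent step can cycle on such problems; the extrapolation step is what gives Mirror-Prox its convergence guarantee in the smooth convex-concave setting.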