
Efficient Algorithms for Smooth Minimax Optimization


Abstract

This paper studies first order methods for solving smooth minimax optimization problems min_x max_y g(x, y), where g(·, ·) is smooth and g(x, ·) is concave for each x. In terms of g(·, y), we consider two settings – strongly convex and nonconvex – and improve upon the best known rates in both. For strongly-convex g(·, y), ∀y, we propose a new direct optimal algorithm combining Mirror-Prox and Nesterov's AGD, and show that it can find the global optimum in Õ(1/k^2) iterations, improving over the current state-of-the-art rate of O(1/k). We use this result along with an inexact proximal point method to provide an Õ(1/k^{1/3}) rate for finding stationary points in the nonconvex setting, where g(·, y) can be nonconvex. This improves over the current best-known rate of O(1/k^{1/5}). Finally, we instantiate our result for finite nonconvex minimax problems, i.e., min_x max_{1≤i≤m} f_i(x) with nonconvex f_i(·), to obtain a convergence rate of O(m(log m)^{3/2}/k^{1/3}) total gradient evaluations for finding a stationary point.
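The paper's optimal strongly-convex–concave method interleaves Mirror-Prox with Nesterov's AGD; that combination is not reproduced here. As a minimal sketch of the underlying extragradient primitive only, the following Python example runs plain extragradient (Mirror-Prox with a Euclidean prox) on a hypothetical strongly-convex–strongly-concave objective g(x, y) = ½‖x‖² + xᵀAy − ½‖y‖², whose unique saddle point is the origin. The objective, step size, and iteration count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch only: plain extragradient (Euclidean Mirror-Prox) on
#   g(x, y) = 0.5*||x||^2 + x^T A y - 0.5*||y||^2,
# which is strongly convex in x and strongly concave in y, with its unique
# saddle point at (0, 0). This is NOT the paper's combined Mirror-Prox + AGD
# algorithm; it only demonstrates the extragradient building block.

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))  # hypothetical coupling matrix

def grad_x(x, y):
    # ∇_x g(x, y) = x + A y
    return x + A @ y

def grad_y(x, y):
    # ∇_y g(x, y) = A^T x - y
    return A.T @ x - y

def extragradient(x, y, eta=0.05, iters=2000):
    for _ in range(iters):
        # Look-ahead half step: gradient descent in x, ascent in y.
        xh = x - eta * grad_x(x, y)
        yh = y + eta * grad_y(x, y)
        # Full step using gradients evaluated at the look-ahead point.
        x = x - eta * grad_x(xh, yh)
        y = y + eta * grad_y(xh, yh)
    return x, y

x, y = extragradient(rng.standard_normal(5), rng.standard_normal(5))
print(np.linalg.norm(x), np.linalg.norm(y))  # both should be near 0
```

Swapping in the gradients of any smooth g(x, y) gives the same loop. Plain Mirror-Prox attains the O(1/k) rate cited in the abstract for convex-concave problems; the paper's contribution is accelerating this to Õ(1/k^2) when g(·, y) is strongly convex by combining the loop with AGD.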

