Efficient Algorithms for Smooth Minimax Optimization

Abstract

This paper studies first order methods for solving smooth minimax optimization problems $\min_x \max_y g(x,y)$, where $g(\cdot,\cdot)$ is smooth and $g(x,\cdot)$ is concave for each $x$. In terms of $g(\cdot,y)$, we consider two settings – strongly convex and nonconvex – and improve upon the best known rates in both. For strongly-convex $g(\cdot,y)$, $\forall y$, we propose a new direct optimal algorithm combining Mirror-Prox and Nesterov's AGD, and show that it can find the global optimum in $\tilde{O}(1/k^2)$ iterations, improving over the current state-of-the-art rate of $O(1/k)$. We use this result along with an inexact proximal point method to provide an $\tilde{O}(1/k^{1/3})$ rate for finding stationary points in the nonconvex setting where $g(\cdot,y)$ can be nonconvex. This improves over the current best-known rate of $O(1/k^{1/4})$. Finally, we instantiate our result for finite nonconvex minimax problems, i.e., $\min_x \max_{1 \le i \le m} f_i(x)$ with nonconvex $f_i(\cdot)$, to obtain a convergence rate of $O(m(\log m)^{3/2}/k^{1/3})$.
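
To make the Mirror-Prox component concrete, the sketch below runs the classical extragradient update on a toy strongly-convex-strongly-concave objective. The objective, step size, and iteration budget are illustrative assumptions; the paper's actual contribution wraps such a subroutine together with Nesterov's AGD to reach the $\tilde{O}(1/k^2)$ rate, which this standalone sketch does not achieve.

```python
# A minimal Mirror-Prox (extragradient) sketch, assuming the toy objective
# g(x, y) = 0.5*x**2 + x*y - 0.5*y**2, which is strongly convex in x and
# strongly concave in y with unique saddle point (0, 0). This shows only the
# Mirror-Prox building block named in the abstract, not the paper's combined
# Mirror-Prox + AGD scheme.

def grad_x(x, y):   # partial derivative of g with respect to x
    return x + y

def grad_y(x, y):   # partial derivative of g with respect to y
    return x - y

x, y = 1.0, 1.0
eta = 0.3           # step size, kept below 1/L for the smoothness constant L of g

for _ in range(200):
    # Extrapolation step: descend in x, ascend in y, from the current point.
    xh = x - eta * grad_x(x, y)
    yh = y + eta * grad_y(x, y)
    # Update step: apply the gradients evaluated at the extrapolated point.
    x = x - eta * grad_x(xh, yh)
    y = y + eta * grad_y(xh, yh)

print(x, y)         # both values approach the saddle point (0, 0)
```

On convex-concave problems, plain Mirror-Prox attains the $O(1/k)$ rate cited above; the paper's improvement in the strongly convex setting comes from combining this subroutine with acceleration on the $x$ variable.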
