
Efficient Algorithms for Non-convex Isotonic Regression through Submodular Optimization


Abstract 

We consider the minimization of submodular functions subject to ordering constraints. We show that this potentially non-convex optimization problem can be cast as a convex optimization problem on a space of uni-dimensional measures, with ordering constraints corresponding to first-order stochastic dominance. We propose new discretization schemes that lead to simple and efficient algorithms based on zero-th, first, or higher order oracles; these algorithms also lead to improvements without isotonic constraints. Finally, our experiments show that non-convex loss functions can be much more robust to outliers for isotonic regression, while still being solvable in polynomial time.
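The discretization idea in the abstract can be made concrete for the simplest ordering, a chain. The sketch below is not the paper's algorithm (the paper handles general orderings via convex optimization over measures and submodular minimization); it is a minimal dynamic program, assuming a separable loss and a user-chosen value grid, that shows why discretizing the values makes a non-convex isotonic problem exactly solvable with only zero-th order (function-value) access to each loss term. The function name isotonic_dp, the grid resolution, and the capped squared loss are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def isotonic_dp(y, loss, grid):
    """Exact isotonic regression on a chain for an arbitrary separable loss,
    after restricting each x_i to a finite, ascending grid of values.

    Minimizes sum_i loss(x_i, y_i) subject to x_1 <= x_2 <= ... <= x_n.
    Runs in O(n * len(grid)) time using only evaluations of `loss`.
    """
    n, k = len(y), len(grid)
    # cost[j] = best objective over the prefix, given the current x_i = grid[j]
    cost = np.array([loss(grid[j], y[0]) for j in range(k)])
    choice = np.zeros((n, k), dtype=int)
    for i in range(1, n):
        # Prefix minima enforce monotonicity: x_{i-1} may take any grid
        # value up to the one chosen for x_i.
        best = np.minimum.accumulate(cost)
        argbest = np.zeros(k, dtype=int)
        for j in range(1, k):
            argbest[j] = j if cost[j] < best[j - 1] else argbest[j - 1]
        choice[i] = argbest
        cost = best + np.array([loss(grid[j], y[i]) for j in range(k)])
    # Backtrack the optimal grid indices.
    j = int(np.argmin(cost))
    x = np.empty(n)
    for i in range(n - 1, -1, -1):
        x[i] = grid[j]
        j = choice[i][j]
    return x

# Example: robust isotonic fit with a capped squared loss (non-convex).
rng = np.random.default_rng(0)
y = np.sort(rng.normal(size=50))
y[rng.random(50) < 0.1] += 5.0          # inject a few large outliers
grid = np.linspace(y.min(), y.max(), 200)
x_hat = isotonic_dp(y, lambda x, t: min((x - t) ** 2, 1.0), grid)
```

Capping the squared loss at 1 bounds each observation's influence on the fit, which is the robustness-to-outliers behavior the abstract attributes to non-convex losses; a plain squared loss in the same routine recovers ordinary isotonic regression on the grid.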
