
Residual Expansion Algorithm: Fast and Effective Optimization for Nonconvex Least Squares Problems

Abstract: We propose the residual expansion (RE) algorithm: a global (or near-global) optimization method for nonconvex least squares problems. Unlike most existing nonconvex optimization techniques, the RE algorithm is not based on stochastic or multi-point searches; therefore, it can achieve fast global optimization. Moreover, the RE algorithm is easy to implement and succeeds in high-dimensional optimization. The RE algorithm exhibits excellent empirical performance on k-means clustering, point-set registration, optimized product quantization, and blind image deblurring.
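To make the problem class concrete, the sketch below (a minimal illustration, not the RE algorithm from the paper) casts k-means clustering, one of the tasks listed in the abstract, as a nonconvex least squares problem over the cluster centers and minimizes it with standard Lloyd-style alternating updates. Such local updates typically stop at a local optimum, which is exactly the limitation that a global (or near-global) method like RE aims to overcome.

```python
# Minimal sketch: k-means as a nonconvex least squares problem.
# NOT the RE algorithm; this is a plain Lloyd-style local optimizer,
# shown only to illustrate the objective the abstract refers to.
import numpy as np


def kmeans_least_squares(X, k, n_iter=100, seed=0):
    """Locally minimize sum_i min_j ||x_i - c_j||^2 over centers c (nonconvex in c)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(n_iter):
        # Assignment step: each point picks its nearest center.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: each center moves to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    # Final sum of squared residuals (the least squares objective).
    dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    residual = np.sum(dists.min(axis=1) ** 2)
    return centers, labels, residual


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Three synthetic 2-D clusters.
    X = np.vstack([rng.normal(loc=m, scale=0.3, size=(50, 2)) for m in (-2.0, 0.0, 2.0)])
    centers, labels, residual = kmeans_least_squares(X, k=3)
    print("sum of squared residuals:", residual)
```

Depending on the random initialization, this local procedure can converge to different residual values, which is the behavior that motivates global methods for this problem class.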

