
Residual Expansion Algorithm: Fast and Effective Optimization for Nonconvex Least Squares Problems

Abstract: We propose the residual expansion (RE) algorithm: a global (or near-global) optimization method for nonconvex least squares problems. Unlike most existing nonconvex optimization techniques, the RE algorithm is not based on either stochastic or multi-point searches; therefore, it can achieve fast global optimization. Moreover, the RE algorithm is easy to implement and successful in high-dimensional optimization. The RE algorithm exhibits excellent empirical performance in terms of k-means clustering, point-set registration, optimized product quantization, and blind image deblurring.
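The abstract does not spell out the algorithmic details of RE. As a point of reference only, the sketch below illustrates the kind of nonconvex least squares objective the paper targets, using k-means clustering (one of the reported applications) minimized with standard Lloyd iterations, a local method that is prone to poor local minima and that global approaches such as RE aim to improve on. All function names and parameters here are illustrative and not taken from the paper.

```python
# Minimal sketch (NOT the RE algorithm): k-means as a nonconvex least squares
# objective sum_i min_j ||x_i - c_j||^2, minimized by plain Lloyd iterations.
# This local method can converge to a poor local minimum depending on the
# initialization, which is the failure mode global methods try to avoid.
import numpy as np

def kmeans_lloyd(X, k, iters=100, seed=0):
    """Alternate nearest-center assignment and center updates (local method)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assignment step: squared distance from every point to every center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: each center becomes the mean of its assigned points.
        new_centers = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    # Final residual (the least squares objective value at the found minimum).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    residual = d2[np.arange(len(X)), labels].sum()
    return centers, labels, residual

# Usage: two well-separated Gaussian blobs.
X = np.vstack([np.random.randn(100, 2) + 3.0, np.random.randn(100, 2) - 3.0])
centers, labels, residual = kmeans_lloyd(X, k=2)
print(residual)
```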
