Least Squares Revisited: Scalable Approaches for Multi-class Prediction

2020-03-03

Abstract

This work provides simple algorithms for multiclass (and multi-label) prediction in settings where both the number of examples n and the data dimension d are relatively large. These robust and parameter-free algorithms are essentially iterative least-squares updates and are very versatile both in theory and in practice. On the theoretical front, we present several variants with convergence guarantees. Owing to their effective use of second-order structure, these algorithms are substantially better than first-order methods in many practical scenarios. On the empirical side, we show how to scale our approach to high-dimensional datasets, achieving dramatic computational speedups over popular optimization packages such as Liblinear and Vowpal Wabbit on standard datasets (MNIST and CIFAR-10), while attaining state-of-the-art accuracies.
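To illustrate the basic idea behind least-squares multiclass prediction, here is a minimal sketch (not the paper's exact algorithm): regress one-hot class labels onto the features with a ridge-regularized least-squares solve, which exploits the second-order (d × d) moment matrix rather than relying on first-order gradient steps. All function names and the regularization parameter below are illustrative assumptions.

```python
import numpy as np

def least_squares_multiclass(X, y, num_classes, reg=1e-3):
    """Fit W minimizing ||XW - Y||^2 + reg*||W||^2, where Y = one-hot(y).

    A minimal sketch, assuming a plain ridge-regression reduction of
    multiclass prediction; the paper's algorithms are iterative variants
    of updates of this form.
    """
    n, d = X.shape
    Y = np.eye(num_classes)[y]               # one-hot targets, shape (n, k)
    A = X.T @ X / n + reg * np.eye(d)        # d x d second-moment matrix
    b = X.T @ Y / n
    return np.linalg.solve(A, b)             # weight matrix W, shape (d, k)

def predict(W, X):
    # Predict the class whose regressed score is largest.
    return np.argmax(X @ W, axis=1)

# Toy usage on well-separated synthetic data.
rng = np.random.default_rng(0)
n, d, k = 300, 5, 3
means = rng.normal(size=(k, d)) * 3.0
y = rng.integers(0, k, size=n)
X = means[y] + rng.normal(size=(n, d))
W = least_squares_multiclass(X, y, k)
acc = (predict(W, X) == y).mean()
```

The appeal of this reduction is that the expensive object is the d × d matrix A, which can be formed in one pass over the data and reused across all k classes; the iterative variants in the paper avoid even forming A explicitly when d is large.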

