TRANSFERRING OPTIMALITY ACROSS DATA DISTRIBUTIONS VIA HOMOTOPY METHODS


2020-01-02

Abstract

Homotopy methods, also known as continuation methods, are a powerful mathematical tool for efficiently solving a variety of problems in numerical analysis. In this work, we propose a novel homotopy-based numerical method that gradually transfers the optimized weights of a neural network across different data distributions. This method generalizes the widely used heuristic of pre-training weights on one dataset and then fine-tuning them on another dataset of interest. We provide a theoretical analysis showing that, under some assumptions, the homotopy method combined with stochastic gradient descent (SGD) is guaranteed to converge to an ε-optimal solution for a target task when started from an ε-optimal solution on a source task. Empirical evaluations on a toy regression dataset and on transferring optimized weights from MNIST to Fashion-MNIST and CIFAR-10 show up to two orders of magnitude of speedup over random initialization and substantial improvements over pre-training.
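The idea sketched in the abstract can be illustrated with a minimal continuation loop: SGD is warm-started along a sequence of blended tasks that interpolate from the source distribution to the target distribution. The snippet below is a toy sketch only, not the authors' implementation; it assumes a linear model with noiseless labels, a simple λ-mixture of the two datasets as the interpolation scheme, and hypothetical helper names (`sgd_steps`, `homotopy_transfer`).

```python
import numpy as np

def sgd_steps(w, X, y, lr=0.05, steps=200, rng=None):
    """Plain SGD on the squared loss of a linear model y ≈ X @ w."""
    if rng is None:
        rng = np.random.default_rng(0)
    for _ in range(steps):
        i = rng.integers(len(X))
        grad = 2.0 * (X[i] @ w - y[i]) * X[i]
        w = w - lr * grad
    return w

def homotopy_transfer(w_source, X_s, y_s, X_t, y_t, n_stages=10, **kw):
    """Continuation from the source task to the target task.

    Anneals the blended objective
        L_lam = (1 - lam) * L_source + lam * L_target
    from lam = 0 to lam = 1, warm-starting each stage at the previous
    stage's solution. Blending the losses is realized here (in
    expectation) by sampling minibatches from a lam-mixture of the two
    datasets; both datasets are assumed to have the same length.
    """
    w = w_source.copy()
    rng = np.random.default_rng(1)
    for lam in np.linspace(0.0, 1.0, n_stages + 1)[1:]:
        n_t = int(round(lam * len(X_t)))   # target fraction grows with lam
        n_s = len(X_s) - n_t               # source fraction shrinks
        X = np.vstack([X_s[:n_s], X_t[:n_t]])
        y = np.concatenate([y_s[:n_s], y_t[:n_t]])
        w = sgd_steps(w, X, y, rng=rng, **kw)
    return w
```

With `lam = 1` the loop reduces to ordinary SGD on the target task alone, so the pre-train-then-fine-tune heuristic is recovered as the two-stage special case `n_stages = 1`.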

