Fast and Scalable Bayesian Deep Learning by Weight-Perturbation in Adam

2020-03-20

Abstract

Uncertainty computation in deep learning is essential to design robust and reliable systems. Variational inference (VI) is a promising approach for such computation, but requires more effort to implement and execute compared to maximum-likelihood methods. In this paper, we propose new natural-gradient algorithms to reduce such efforts for Gaussian mean-field VI. Our algorithms can be implemented within the Adam optimizer by perturbing the network weights during gradient evaluations, and uncertainty estimates can be cheaply obtained by using the vector that adapts the learning rate. This requires less memory, computation, and implementation effort than existing VI methods, while obtaining uncertainty estimates of comparable quality. Our empirical results confirm this and further suggest that the weight-perturbation in our algorithm could be useful for exploration in reinforcement learning and stochastic optimization.
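To make the core idea concrete, the snippet below is a minimal NumPy sketch of a Vadam-style update on a toy linear-regression problem: weights are perturbed by a Gaussian whose scale comes from the same second-moment vector that adapts Adam's learning rate, and a posterior standard deviation is read off that vector at the end. It is not the authors' implementation; the grad_loss helper, the toy data, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

# Sketch of a Vadam-style update (simplified; hedged, not the paper's code).
# Posterior q(w) = N(mu, sigma^2) with sigma^2 = 1 / (N * (s + lam/N)),
# where s is the Adam-style second-moment vector, N the dataset size,
# and lam the Gaussian prior precision (lam/N is the scaled precision).

rng = np.random.default_rng(0)

def grad_loss(w, X, y):
    """Gradient of mean squared error for a linear model (illustrative)."""
    return 2.0 * X.T @ (X @ w - y) / len(y)

# Toy data (illustrative).
N, D = 100, 5
X = rng.normal(size=(N, D))
w_true = rng.normal(size=D)
y = X @ w_true + 0.1 * rng.normal(size=N)

# Hyperparameters (illustrative values).
alpha, beta1, beta2, lam = 0.01, 0.9, 0.999, 1.0
mu = np.zeros(D)   # posterior mean (plays the role of Adam's weights)
m = np.zeros(D)    # first-moment estimate
s = np.zeros(D)    # second-moment estimate (also yields the uncertainty)

for t in range(1, 5001):
    # Weight perturbation: sample w ~ q using the adaptive vector s.
    sigma = 1.0 / np.sqrt(N * (s + lam / N))
    w = mu + sigma * rng.normal(size=D)

    # Gradient is evaluated at the perturbed weights.
    g = grad_loss(w, X, y)
    m = beta1 * m + (1 - beta1) * (g + (lam / N) * mu)
    s = beta2 * s + (1 - beta2) * g**2

    # Bias-corrected Adam-style step on the posterior mean.
    m_hat = m / (1 - beta1**t)
    s_hat = s / (1 - beta2**t)
    mu -= alpha * m_hat / (np.sqrt(s_hat) + lam / N)

# Uncertainty estimate obtained cheaply from the learning-rate-adapting vector.
posterior_std = 1.0 / np.sqrt(N * (s + lam / N))
print("posterior mean:", np.round(mu, 3))
print("posterior std: ", np.round(posterior_std, 3))
```

The key design point the sketch illustrates is that almost nothing changes relative to plain Adam: the only additions are the Gaussian perturbation before each gradient evaluation and the prior term lam/N, so memory and computation stay close to the original optimizer.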

