LIPSCHITZ CONSTANT ESTIMATION FOR NEURAL NETWORKS VIA SPARSE POLYNOMIAL OPTIMIZATION

2019-12-30

Abstract

We introduce LiPopt, a polynomial optimization framework for computing increasingly tight upper bounds on the Lipschitz constant of neural networks. The underlying optimization problems reduce to either linear programming (LP) or semidefinite programming (SDP). We show how to exploit structural properties of the network, such as sparsity, to significantly reduce the computational cost; this is especially useful for convolutional and pruned neural networks. We conduct experiments on networks with random weights as well as networks trained on MNIST, showing that in the particular case of the ℓ∞-Lipschitz constant, our approach yields superior estimates compared to other baselines available in the literature.
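For context, the loose baseline that such frameworks tighten can be computed directly: for a feed-forward network with 1-Lipschitz activations (e.g. ReLU), the product of the layers' induced operator norms is always a valid upper bound on the Lipschitz constant. A minimal sketch for the ℓ∞ case (illustrative only; the function name and example weights are ours, not from the paper):

```python
import numpy as np

def naive_lipschitz_upper_bound(weights):
    """Product of induced l-infinity operator norms of the weight matrices.

    For a feed-forward network with 1-Lipschitz activations (e.g. ReLU),
    this product is a valid but typically loose upper bound on the
    l-infinity Lipschitz constant. The induced l-inf -> l-inf norm of a
    matrix is its maximum absolute row sum.
    """
    bound = 1.0
    for W in weights:
        bound *= np.abs(W).sum(axis=1).max()
    return bound

# Hypothetical two-layer network: R^2 -> R^2 -> R^1
W1 = np.array([[1.0, -2.0],
               [3.0,  4.0]])   # max absolute row sum = 7
W2 = np.array([[0.5,  0.5]])   # max absolute row sum = 1
print(naive_lipschitz_upper_bound([W1, W2]))  # 7.0
```

Because this bound multiplies worst-case norms layer by layer, it ignores how the layers interact; LP/SDP relaxations of the kind described in the abstract account for that interaction and therefore give tighter estimates.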

