
Neural Networks and Rational Functions

2020-03-09

Abstract

Neural networks and rational functions efficiently approximate each other. In more detail, it is shown here that for any ReLU network there exists a rational function of degree O(polylog(1/ε)) which is ε-close, and similarly for any rational function there exists a ReLU network of size O(polylog(1/ε)) which is ε-close. By contrast, polynomials need degree Ω(poly(1/ε)) to approximate even a single ReLU. When converting a ReLU network to a rational function as above, the hidden constants depend exponentially on the number of layers, which is shown to be tight; in other words, a compositional representation can be beneficial even for rational functions.
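To get a feel for the polynomial-versus-rational gap claimed above, here is a small numerical sketch (not taken from the paper). It compares approximations of a single ReLU on [-1, 1]: a Chebyshev least-squares fit is used as a stand-in for the best degree-d polynomial, and Newman's classical rational approximant of |x| (shifted via relu(x) = (x + |x|)/2) is used as an illustrative rational approximation of comparable degree; both choices are assumptions made for this sketch, not the paper's exact construction.

```python
# A small numerical sketch (not from the paper) contrasting polynomial and
# rational approximation of a single ReLU on [-1, 1].
# Assumptions: a Chebyshev least-squares fit stands in for the best degree-d
# polynomial, and Newman's classical rational approximant of |x| (shifted via
# relu(x) = (x + |x|) / 2) stands in for the rational side.
import numpy as np


def relu(x):
    return np.maximum(x, 0.0)


def newman_rational_relu(x, n):
    """Degree-n Newman rational approximation of |x|, shifted to ReLU.

    With p(x) = prod_k (x + xi^k) and xi = exp(-1/sqrt(n)), the function
    r(x) = x * (p(x) - p(-x)) / (p(x) + p(-x)) approximates |x| on [-1, 1]
    with uniform error on the order of exp(-sqrt(n)).
    """
    xi = np.exp(-1.0 / np.sqrt(n))
    nodes = xi ** np.arange(n)
    p_pos = np.prod(x[:, None] + nodes[None, :], axis=1)
    p_neg = np.prod(-x[:, None] + nodes[None, :], axis=1)
    abs_approx = x * (p_pos - p_neg) / (p_pos + p_neg)
    return 0.5 * (x + abs_approx)


xs = np.linspace(-1.0, 1.0, 4001)
target = relu(xs)

for d in (9, 25, 49, 100):
    # Polynomial side: Chebyshev least-squares fit of degree d
    # (a well-conditioned proxy for the minimax polynomial).
    cheb = np.polynomial.Chebyshev.fit(xs, target, d)
    poly_err = np.max(np.abs(cheb(xs) - target))
    # Rational side: Newman approximant of comparable degree.
    rat_err = np.max(np.abs(newman_rational_relu(xs, d) - target))
    print(f"degree {d:3d}: poly max error {poly_err:.2e}, "
          f"rational max error {rat_err:.2e}")
```

In this sketch the polynomial error shrinks only polynomially in the degree, while the rational error shrinks roughly like exp(-sqrt(d)), which is consistent with the Ω(poly(1/ε)) versus O(polylog(1/ε)) contrast stated in the abstract.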
