
A THEORY OF USABLE INFORMATION UNDER COMPUTATIONAL CONSTRAINTS


Abstract

We propose a new framework for reasoning about information in complex systems. Our foundation is based on a variational extension of Shannon's information theory that takes into account the modeling power and computational constraints of the observer. The resulting predictive V-information encompasses mutual information and other notions of informativeness such as the coefficient of determination. Unlike Shannon's mutual information, and in violation of the data processing inequality, V-information can be created through computation. This is consistent with deep neural networks extracting hierarchies of progressively more informative features in representation learning. Additionally, we show that by incorporating computational constraints, V-information can be reliably estimated from data even in high dimensions, with PAC-style guarantees. Empirically, we demonstrate that predictive V-information is more effective than mutual information for structure learning and fair representation learning.
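To make the estimation claim in the abstract concrete, the sketch below computes a V-information-style quantity on synthetic data: the predictive family is taken to be logistic regression, the conditional V-entropy H_V(Y|X) is approximated by held-out log loss, and the usable information I_V(X -> Y) is the drop in log loss relative to the best constant predictor H_V(Y|∅). This is a minimal illustration under those assumptions, not the authors' implementation; the synthetic data and the choice of scikit-learn are assumptions made here for illustration.

```python
# Illustrative sketch (not the authors' code): estimate a V-information-style
# quantity I_V(X -> Y) = H_V(Y | empty) - H_V(Y | X), where the predictive
# family V is logistic regression and conditional V-entropies are approximated
# by held-out negative log-likelihood.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import log_loss

rng = np.random.default_rng(0)

# Synthetic data: Y is a noisy threshold of a linear function of X.
n, d = 5000, 20
X = rng.normal(size=(n, d))
w = rng.normal(size=d)
y = (X @ w + rng.normal(scale=2.0, size=n) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# H_V(Y | empty): the best side-information-free predictor in the family,
# approximated by predicting the empirical marginal of Y from the train split.
p_marginal = np.full(len(y_te), y_tr.mean())
h_y_given_nothing = log_loss(y_te, p_marginal)

# H_V(Y | X): fit a member of the family on (X, Y) and score held-out log loss.
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
h_y_given_x = log_loss(y_te, clf.predict_proba(X_te)[:, 1])

# Estimated usable information (in nats): larger means X is more useful for
# predicting Y under this computationally constrained observer.
print(f"H_V(Y|empty) = {h_y_given_nothing:.3f} nats")
print(f"H_V(Y|X)     = {h_y_given_x:.3f} nats")
print(f"I_V(X -> Y) ~= {h_y_given_nothing - h_y_given_x:.3f} nats")
```

Running the same estimate on a processed version of X (for example, features produced by a trained network) can yield a larger value than on the raw input, which is the sense in which the abstract says V-information can be created through computation, even though Shannon mutual information cannot increase under the data processing inequality.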
