Anti-differentiating approximation algorithms: A case study with min-cuts, spectral, and flow

2020-03-03

Abstract

We formalize and illustrate the general concept of algorithmic anti-differentiation: given an algorithmic procedure, e.g., an approximation algorithm for which worst-case approximation guarantees are available or a heuristic that has been engineered to be practically useful but for which a precise theoretical understanding is lacking, an algorithmic anti-derivative is a precise statement of an optimization problem that is exactly solved by that procedure. We explore this concept with a case study of approximation algorithms for finding locally-biased partitions in data graphs, demonstrating connections between min-cut objectives, a personalized version of the popular PageRank vector, and the highly effective "push" procedure for computing an approximation to personalized PageRank. We show, for example, that this latter algorithm solves (exactly, but implicitly) an ℓ1-regularized ℓ2-regression problem, a fact that helps to explain its excellent performance in practice. We expect that, when available, these implicit optimization problems will be critical for rationalizing and predicting the performance of many approximation algorithms on realistic data.
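To make the abstract's central object concrete, the following is a minimal sketch of the "push" procedure for approximating a personalized PageRank vector, in the style of the Andersen–Chung–Lang local-clustering algorithm. The function name `ppr_push`, the graph encoding as an adjacency dict, and the parameter values for `alpha` (teleportation) and `eps` (per-degree residual tolerance) are illustrative assumptions, not drawn from the paper; the paper's contribution is the observation that this procedure's output exactly solves an implicit ℓ1-regularized ℓ2-regression problem.

```python
def ppr_push(adj, seed, alpha=0.15, eps=1e-4):
    """Sketch of the ACL-style push procedure for personalized PageRank.

    adj: dict mapping each vertex to a list of its neighbors (undirected).
    Returns (p, r): the approximation vector p and residual vector r.
    Invariant: sum(p) + sum(r) == 1 throughout.
    """
    p = {u: 0.0 for u in adj}   # approximate PageRank mass settled so far
    r = {u: 0.0 for u in adj}   # residual mass still to be distributed
    r[seed] = 1.0
    while True:
        # pick any vertex whose residual exceeds eps times its degree
        active = [u for u in adj if r[u] >= eps * len(adj[u])]
        if not active:
            break
        u = active[0]
        mass = r[u]
        p[u] += alpha * mass                 # settle an alpha fraction at u
        r[u] = (1 - alpha) * mass / 2        # half of the remainder stays
        for v in adj[u]:                     # spread the other half evenly
            r[v] += (1 - alpha) * mass / (2 * len(adj[u]))
    return p, r
```

Because each push only touches a vertex and its neighbors, the procedure explores the graph locally around the seed, which is what makes it effective for locally-biased partitioning; the implicit ℓ1 regularization identified in the paper explains why the resulting vector is sparse.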
