Sparse Logistic Regression Learns All Discrete Pairwise Graphical Models

2020-02-20

Abstract

We characterize the effectiveness of a classical algorithm for recovering the Markov graph of a general discrete pairwise graphical model from i.i.d. samples. The algorithm is (appropriately regularized) maximum conditional log-likelihood, which involves solving a convex program for each node; for Ising models this is $\ell_1$-constrained logistic regression, while for more general alphabets an $\ell_{2,1}$ group-norm constraint needs to be used. We show that this algorithm can recover any arbitrary discrete pairwise graphical model, and we characterize its sample complexity as a function of model width, alphabet size, edge parameter accuracy, and the number of variables. We show that along every one of these axes it matches or improves on all existing results and algorithms for this problem. Our analysis applies a sharp generalization error bound for logistic regression when the weight vector has an $\ell_1$ constraint and the sample vector has an $\ell_\infty$ constraint. We also show that the proposed convex programs can be efficiently solved in $\tilde{O}(n^2)$ running time (where $n$ is the number of variables) under the same statistical guarantees. We provide experimental results to support our analysis.
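For the Ising case, the node-wise procedure described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses scikit-learn's penalized (Lagrangian) form of $\ell_1$-regularized logistic regression as a stand-in for the $\ell_1$-constrained program, and the chain graph, coupling strength, sample size, and threshold are arbitrary choices for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy chain Ising model on 4 spins x_i in {-1,+1}, edges (0-1), (1-2), (2-3).
n = 4
true_edges = [(0, 1), (1, 2), (2, 3)]
J = np.zeros((n, n))
for i, j in true_edges:
    J[i, j] = J[j, i] = 0.8  # arbitrary coupling strength for the demo

# Exact sampling by enumerating all 2^n configurations of the model.
states = np.array([[1 if (s >> k) & 1 else -1 for k in range(n)]
                   for s in range(2 ** n)])
logp = np.einsum("si,ij,sj->s", states, J, states) / 2.0
p = np.exp(logp)
p /= p.sum()
X = states[rng.choice(len(states), size=5000, p=p)]

# Node-wise l1-regularized logistic regression: regress each spin on the rest.
# The conditional P(x_i = 1 | x_rest) is a logistic function of the neighbors,
# so nonzero regression weights indicate edges in the Markov graph.
adj = np.zeros((n, n), dtype=bool)
for i in range(n):
    others = [j for j in range(n) if j != i]
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    clf.fit(X[:, others], X[:, i])
    for j, w in zip(others, clf.coef_.ravel()):
        if abs(w) > 0.1:       # threshold away small spurious weights
            adj[i, j] = True

# Keep an edge only if it is detected from both endpoints.
recovered = sorted((i, j) for i in range(n)
                   for j in range(i + 1, n) if adj[i, j] and adj[j, i])
print(recovered)
```

With enough samples relative to the coupling strength, the recovered edge set matches the chain `[(0, 1), (1, 2), (2, 3)]`; the regularization weight `C` and the threshold trade off false edges against missed edges.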

