Dimension-free Concentration Bounds on Hankel Matrices for Spectral Learning

2020-03-04

Abstract

Learning probabilistic models over strings is an important issue for many applications. Spectral methods propose elegant solutions to the problem of inferring weighted automata from finite samples of variable-length strings drawn from an unknown target distribution. These methods rely on a singular value decomposition of a matrix H_S, called the Hankel matrix, that records the frequencies of (some of) the observed strings. The accuracy of the learned distribution depends both on the quantity of information embedded in H_S and on the distance between H_S and its mean H_r. Existing concentration bounds seem to indicate that the concentration around H_r gets looser as its size grows, suggesting a tradeoff between the quantity of used information and the size of H_r. We propose new dimension-free concentration bounds for several variants of Hankel matrices. Experiments demonstrate that these bounds are tight and that they significantly improve existing bounds. These results suggest that the concentration rate of the Hankel matrix around its mean does not constitute an argument for limiting its size.
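To make the objects in the abstract concrete, here is a minimal sketch (not the paper's implementation) of building an empirical Hankel matrix H_S from a finite sample of strings and taking its singular value decomposition, as spectral methods do. The choice of prefix and suffix bases, the helper name `empirical_hankel`, and the toy sample are all illustrative assumptions.

```python
from collections import Counter

import numpy as np


def empirical_hankel(sample, prefixes, suffixes):
    """Entry (i, j) is the empirical frequency in `sample` of the string
    prefixes[i] + suffixes[j] (illustrative basis choice, not the paper's)."""
    counts = Counter(sample)
    n = len(sample)
    H = np.zeros((len(prefixes), len(suffixes)))
    for i, u in enumerate(prefixes):
        for j, v in enumerate(suffixes):
            H[i, j] = counts[u + v] / n
    return H


# Toy sample of variable-length strings over the alphabet {a, b}.
sample = ["", "a", "ab", "ab", "b", "aab"]
prefixes = ["", "a", "b"]
suffixes = ["", "a", "b"]

H_S = empirical_hankel(sample, prefixes, suffixes)
# Spectral learning methods use the SVD of H_S; the decay of the singular
# values indicates an estimate of the rank of the target weighted automaton.
U, s, Vt = np.linalg.svd(H_S)
```

As the sample grows, H_S concentrates around its mean H_r; the paper's bounds control the (spectral-norm) distance between the two independently of the matrix dimensions.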

