Online and Differentially-Private Tensor Decomposition

2020-02-05

Abstract

Tensor decomposition is an important tool for big data analysis. In this paper, we resolve many of the key algorithmic questions regarding the robustness, memory efficiency, and differential privacy of tensor decomposition. We propose simple variants of the tensor power method which enjoy these strong properties. We present the first guarantees for an online tensor power method, which has a linear memory requirement. Moreover, we present a noise-calibrated tensor power method with efficient privacy guarantees. At the heart of all these guarantees lies a careful perturbation analysis derived in this paper, which improves upon the existing results significantly.

Keywords: Tensor decomposition, tensor power method, online methods, streaming, differential privacy, perturbation analysis.
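The abstract refers to the tensor power method, the classical iteration that the paper's online and private variants build on. As a point of reference, here is a minimal NumPy sketch of the basic (non-private, batch) power update v ← T(I, v, v) / ‖T(I, v, v)‖ for a symmetric third-order tensor; the function name and the rank-1 test tensor are illustrative, not from the paper.

```python
import numpy as np

def tensor_power_iteration(T, n_iter=100, seed=0):
    """Approximate the top eigenpair of a symmetric 3rd-order tensor T
    via the power update v <- T(I, v, v) / ||T(I, v, v)||."""
    rng = np.random.default_rng(seed)
    d = T.shape[0]
    v = rng.standard_normal(d)
    v /= np.linalg.norm(v)
    for _ in range(n_iter):
        # Contract T along its last two modes: (T(I, v, v))_i = sum_jk T_ijk v_j v_k
        w = np.einsum('ijk,j,k->i', T, v, v)
        v = w / np.linalg.norm(w)
    # Corresponding eigenvalue is the full contraction T(v, v, v).
    lam = np.einsum('ijk,i,j,k->', T, v, v, v)
    return lam, v

# Sanity check on a rank-1 symmetric tensor T = 2.5 * u (x) u (x) u:
# the iteration should recover the eigenvalue 2.5 and eigenvector u.
u = np.array([3.0, 4.0]) / 5.0
T = 2.5 * np.einsum('i,j,k->ijk', u, u, u)
lam, v = tensor_power_iteration(T)
```

The paper's differentially private variant modifies this loop by adding calibrated noise to each power update; the sketch above shows only the noiseless baseline that the perturbation analysis is measured against.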
