
Local Privacy and Minimax Bounds: Sharp Rates for Probability Estimation


Abstract

We provide a detailed study of the estimation of probability distributions—discrete and continuous—in a stringent setting in which data is kept private even from the statistician. We give sharp minimax rates of convergence for estimation in these locally private settings, exhibiting fundamental trade-offs between privacy and convergence rate, as well as providing tools to allow movement along the privacy-statistical efficiency continuum. One of the consequences of our results is that Warner’s classical work on randomized response is an optimal way to perform survey sampling while maintaining privacy of the respondents.
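The abstract's closing claim refers to Warner's classical randomized response mechanism. As a rough illustration (not the paper's own code), the sketch below privatizes binary survey answers with a flip probability parameterized by a privacy level epsilon and then debiases the aggregate to recover the population proportion. The parameter names (epsilon, true_pi, n) and the choice p = e^epsilon / (1 + e^epsilon) are assumptions made here for illustration; Warner's original scheme allows any design probability p > 1/2.

import numpy as np

def randomized_response(x, epsilon, rng):
    # Report the true bit with probability p = e^eps / (1 + e^eps),
    # otherwise report its flip; each respondent's answer stays private.
    p = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    flip = rng.random(x.shape) > p
    return np.where(flip, 1 - x, x)

def estimate_proportion(z, epsilon):
    # Debias the privatized responses: E[Z] = (1 - p) + pi * (2p - 1),
    # so pi = (mean(Z) - (1 - p)) / (2p - 1).
    p = np.exp(epsilon) / (1.0 + np.exp(epsilon))
    return (z.mean() - (1.0 - p)) / (2.0 * p - 1.0)

# Illustrative usage with hypothetical values:
rng = np.random.default_rng(0)
true_pi, n, epsilon = 0.3, 10_000, 1.0
x = rng.binomial(1, true_pi, size=n)       # private binary attributes
z = randomized_response(x, epsilon, rng)   # only z is revealed to the statistician
print(estimate_proportion(z, epsilon))     # close to true_pi for this sample size

As epsilon shrinks, 2p - 1 approaches zero and the variance of the debiased estimator grows, which is the privacy-versus-statistical-efficiency trade-off the abstract describes.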

