Abstract
The exponential mechanism is a general method to construct a randomized estimator that satisfies (ε, 0)-differential privacy. Recently, Wang et al. showed that the Gibbs posterior, a data-dependent probability distribution that generalizes the Bayesian posterior, is essentially equivalent to the exponential mechanism under certain boundedness conditions on the loss function. While the exponential mechanism provides a way to build an ε-differentially private algorithm, it requires the loss function to be bounded, which is quite stringent for some learning problems. In this paper, we focus on (ε, δ)-differential privacy of Gibbs posteriors with convex and Lipschitz loss functions. Our result extends the classical exponential mechanism, allowing the loss functions to have unbounded sensitivity.
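For concreteness, the classical exponential mechanism referenced above can be sketched for a finite candidate set as follows. This is a minimal illustration of the standard mechanism, not the method proposed in the paper; the function name and parameters are illustrative, and the ε-differential privacy guarantee holds only when the score function's global sensitivity is at most the stated bound.

```python
import math
import random

def exponential_mechanism(candidates, score, sensitivity, epsilon, rng=random):
    """Sample a candidate with probability proportional to
    exp(epsilon * score(c) / (2 * sensitivity)).

    This is the classical exponential mechanism: it satisfies
    epsilon-differential privacy provided `score` has global
    sensitivity at most `sensitivity` with respect to the data.
    (Illustrative sketch; not the paper's extended mechanism.)
    """
    # Shift exponents by their maximum for numerical stability.
    exps = [epsilon * score(c) / (2.0 * sensitivity) for c in candidates]
    m = max(exps)
    weights = [math.exp(e - m) for e in exps]
    total = sum(weights)
    # Inverse-CDF sampling over the normalized weights.
    r = rng.random() * total
    acc = 0.0
    for c, w in zip(candidates, weights):
        acc += w
        if r <= acc:
            return c
    return candidates[-1]
```

Note that as ε grows the output concentrates on the highest-scoring candidate, while small ε yields a nearly uniform (more private, less accurate) selection; the boundedness requirement on the score/loss is exactly the restriction the paper aims to relax.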