Scaling Gaussian Process Regression with Derivatives

2020-02-14

Abstract 

Gaussian processes (GPs) with derivatives are useful in many applications, including Bayesian optimization, implicit surface reconstruction, and terrain reconstruction. Fitting a GP to function values and derivatives at n points in d dimensions requires linear solves and log determinants with an n(d + 1) × n(d + 1) positive definite matrix, leading to prohibitive O(n^3 d^3) computations for standard direct methods. We propose iterative solvers using fast O(nd) matrix-vector multiplications (MVMs), together with pivoted Cholesky preconditioning that cuts the iterations to convergence by several orders of magnitude, allowing for fast kernel learning and prediction. Our approaches, together with dimensionality reduction, enable Bayesian optimization with derivatives to scale to high-dimensional problems and large evaluation budgets.
