A Distributed Variational Inference Framework for Unifying Parallel Sparse Gaussian Process Regression Models

2020-03-05

Abstract

This paper presents a novel distributed variational inference framework that unifies many parallel sparse Gaussian process regression (SGPR) models for scalable hyperparameter learning with big data. To achieve this, our framework exploits a correlated noise process model that represents the observation noises as a finite realization of a high-order Gaussian Markov random process. Varying the Markov order and the covariance function of the noise process model yields different variational SGPR models. This, in turn, allows the correlation structure of the noise process to be characterized for which a particular variational SGPR model is optimal. We empirically evaluate the predictive performance and scalability of the distributed variational SGPR models unified by our framework on two real-world datasets.
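For context on the models being unified, the sketch below implements the standard Titsias-style variational lower bound for sparse GP regression with inducing points, under the usual i.i.d. Gaussian noise assumption. The paper's framework generalizes this noise term to a correlated high-order Gaussian Markov process, which is not reproduced here; all function names, kernel choices, and parameter values are illustrative, not taken from the paper.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sgpr_elbo(X, y, Z, noise_var=0.1, lengthscale=1.0, variance=1.0):
    # Titsias variational bound for sparse GPR with inducing inputs Z,
    # assuming i.i.d. Gaussian observation noise (the special case that
    # the paper's correlated-noise framework generalizes).
    n, m = X.shape[0], Z.shape[0]
    Kuu = rbf_kernel(Z, Z, lengthscale, variance) + 1e-8 * np.eye(m)
    Kuf = rbf_kernel(Z, X, lengthscale, variance)
    L = np.linalg.cholesky(Kuu)
    A = np.linalg.solve(L, Kuf) / np.sqrt(noise_var)   # m x n
    B = A @ A.T + np.eye(m)
    LB = np.linalg.cholesky(B)
    c = np.linalg.solve(LB, A @ y) / np.sqrt(noise_var)
    # log N(y | 0, Qff + noise_var * I) via the low-rank (Woodbury) form,
    # where Qff = Kfu Kuu^{-1} Kuf is the Nystrom approximation of Kff.
    bound = (-0.5 * n * np.log(2.0 * np.pi * noise_var)
             - np.sum(np.log(np.diag(LB)))
             - 0.5 * y @ y / noise_var
             + 0.5 * c @ c)
    # Trace correction: -tr(Kff - Qff) / (2 * noise_var).
    kff_diag = variance * np.ones(n)
    qff_diag = np.sum(A**2, 0) * noise_var
    bound -= 0.5 * np.sum(kff_diag - qff_diag) / noise_var
    return bound
```

With the inducing inputs set equal to the training inputs, the trace term vanishes and the bound recovers the exact GP log marginal likelihood; with fewer inducing points it is a strict lower bound, which is what makes it usable for scalable hyperparameter learning.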

