Abstract
Gaussian Process Regression (GPR) is a powerful
Bayesian method. However, the performance of
GPR can be significantly degraded when the training data are contaminated by outliers, including
target outliers and input outliers. Although there
are some variants of GPR (e.g., GPR with a Student-t
likelihood (GPRT)) that aim to handle outliers, most
of them focus on target outliers,
while little effort has been devoted to
input outliers. In contrast, in this work, we aim to
handle both the target outliers and the input outliers
at the same time. Specifically, we replace the Gaussian noise in GPR with independent Student-t noise
to cope with the target outliers. Moreover, to enhance the robustness w.r.t. the input outliers, we use
a Student-t Process prior instead of the common
Gaussian Process prior, leading to Student-t Process Regression with Student-t Likelihood (TPRT).
We theoretically show that TPRT is more robust to
both input and target outliers than GPR and GPRT,
and prove that both GPR and GPRT are special
cases of TPRT. Various experiments demonstrate
that TPRT outperforms GPR and its variants on
both synthetic and real datasets.
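The intuition behind replacing Gaussian noise with Student-t noise can be illustrated with a minimal, hypothetical sketch (not the paper's implementation): because the Student-t density has heavy tails, a large residual from a target outlier incurs a bounded log-likelihood penalty, whereas under a Gaussian the penalty grows quadratically and drags the fit toward the outlier. The residual values, scale, and degrees of freedom below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm, t

# Residuals from a hypothetical regression fit; the last point
# (5.0) plays the role of a target outlier.
residuals = np.array([0.1, -0.2, 0.05, 5.0])
scale = 0.3  # assumed noise scale

# Per-point log-likelihoods under a Gaussian noise model.
gauss_ll = norm.logpdf(residuals, scale=scale)

# Per-point log-likelihoods under a Student-t noise model with
# nu = 3 degrees of freedom and the same scale (change of variables:
# logpdf of residual/scale minus log(scale)).
student_ll = t.logpdf(residuals / scale, df=3) - np.log(scale)

# The Gaussian penalizes the outlier quadratically (huge negative
# log-likelihood), while the heavy-tailed Student-t keeps the
# outlier's influence on the fit bounded.
print("Gaussian log-lik at outlier: ", gauss_ll[-1])
print("Student-t log-lik at outlier:", student_ll[-1])
```

Under these assumed values, the Gaussian log-likelihood at the outlier is more than an order of magnitude more negative than the Student-t one, which is exactly the mechanism that makes a Student-t likelihood robust to target outliers.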