Compositionally-warped Gaussian processes

Rios, Gonzalo; Tobar, Felipe

Abstract

The Gaussian process (GP) is a nonparametric prior distribution over functions indexed by time, space, or other high-dimensional index sets. The GP is a flexible model, yet it is limited by its very nature: it can only model Gaussian marginal distributions. To model non-Gaussian data, a GP can be warped by a nonlinear transformation (or warping), as performed by warped GPs (WGPs) and more computationally demanding alternatives such as Bayesian WGPs and deep GPs. However, the WGP requires a numerical approximation of the inverse warping for prediction, which increases its computational complexity in practice. To sidestep this issue, we construct a novel class of warpings consisting of compositions of multiple elementary functions, for which the inverse is known explicitly. We then propose the compositionally-warped GP (CWGP), a non-Gaussian generative model whose expressiveness follows from its deep compositional architecture and whose computational efficiency is guaranteed by the analytical inverse warping. Experimental validation using synthetic and real-world datasets confirms that the proposed CWGP is robust to the choice of warpings and provides more accurate point predictions, better-trained models, and shorter computation times than the WGP.
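To illustrate the idea of a compositional warping with a closed-form inverse, the following minimal Python sketch composes two elementary invertible maps (an affine layer and a sinh-arcsinh layer, used here as illustrative choices; the class names, parameters, and layer selection are assumptions, not the authors' code) and inverts the composition analytically by applying the elementary inverses in reverse order, which is what removes the need for the numerical inverse used by the standard WGP.

```python
import numpy as np

# Minimal sketch (not the authors' implementation): elementary invertible
# warpings with closed-form inverses, composed into a single warping.

class Affine:
    """phi(x) = a * x + b, with inverse phi^{-1}(y) = (y - b) / a."""
    def __init__(self, a=1.0, b=0.0):
        self.a, self.b = a, b
    def forward(self, x):
        return self.a * x + self.b
    def inverse(self, y):
        return (y - self.b) / self.a

class SinhArcsinh:
    """phi(x) = sinh(a * arcsinh(x) + b), inverse sinh((arcsinh(y) - b) / a)."""
    def __init__(self, a=1.0, b=0.0):
        self.a, self.b = a, b
    def forward(self, x):
        return np.sinh(self.a * np.arcsinh(x) + self.b)
    def inverse(self, y):
        return np.sinh((np.arcsinh(y) - self.b) / self.a)

class Composition:
    """Apply layers in order; invert by applying elementary inverses in reverse."""
    def __init__(self, layers):
        self.layers = layers
    def forward(self, x):
        for layer in self.layers:
            x = layer.forward(x)
        return x
    def inverse(self, y):
        for layer in reversed(self.layers):
            y = layer.inverse(y)
        return y

# Example: push draws of a latent Gaussian variable (a stand-in for GP samples)
# through the warping to obtain non-Gaussian observations, then map them back
# exactly, with no numerical root finding.
warping = Composition([SinhArcsinh(a=0.7, b=0.3), Affine(a=2.0, b=1.0)])
f = np.random.default_rng(0).standard_normal(5)   # latent Gaussian draws
y = warping.forward(f)                            # non-Gaussian observations
f_recovered = warping.inverse(y)                  # analytical inverse warping
print(np.allclose(f, f_recovered))                # True
```

In a full CWGP, the latent draws would come from a GP posterior and the warping parameters would be learned jointly with the GP hyperparameters; the sketch only shows why prediction stays cheap when every layer has an explicit inverse.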

More information

Title according to WOS: Compositionally-warped Gaussian processes
Title according to SCOPUS: Compositionally-warped Gaussian processes
Journal title: NEURAL NETWORKS
Volume: 118
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Publication date: 2019
Start page: 235
End page: 246
Language: English
DOI: 10.1016/j.neunet.2019.06.012
Notes: ISI, SCOPUS - ISI