Taylor & Francis Group
ucgs_a_2129662_sm6400.pdf (441.01 kB)

Vecchia-approximated Deep Gaussian Processes for Computer Experiments

Version 2 2022-11-08, 17:20
Version 1 2022-10-03, 17:20
journal contribution
posted on 2022-10-03, 17:20 authored by Annie Sauer, Andrew Cooper, Robert B. Gramacy

Deep Gaussian processes (DGPs) upgrade ordinary GPs through functional composition, in which intermediate GP layers warp the original inputs, providing flexibility to model non-stationary dynamics. Two DGP regimes have emerged in recent literature. A "big data" regime, prevalent in machine learning, favors approximate, optimization-based inference for fast, high-fidelity prediction. A "small data" regime, preferred for computer surrogate modeling, deploys posterior integration for enhanced uncertainty quantification (UQ). We aim to bridge this gap by expanding the capabilities of Bayesian DGP posterior inference through the incorporation of the Vecchia approximation, allowing linear computational scaling without compromising accuracy or UQ. We are motivated by surrogate modeling of simulation campaigns with upwards of 100,000 runs (a size too large for previous fully Bayesian implementations) and demonstrate prediction and UQ superior to that of "big data" competitors. All methods are implemented in the deepgp package on CRAN.
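The Vecchia approximation mentioned in the abstract replaces one n-dimensional Gaussian density with a product of small conditional densities, where each observation conditions only on (up to) m previously ordered nearest neighbors, yielding the linear scaling in n described above. The sketch below is a minimal, hedged illustration of that idea in Python with a squared-exponential kernel; it is not the paper's implementation (the actual methods live in the deepgp R package), and the kernel, ordering, and neighbor-selection choices here are illustrative assumptions only.

```python
import numpy as np

def kernel(X1, X2, lengthscale=1.0):
    # Squared-exponential kernel; an illustrative choice, not the paper's.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * lengthscale**2))

def vecchia_loglik(X, y, m=10, nugget=1e-6):
    """Vecchia-approximated GP log-likelihood.

    Each observation i conditions only on (up to) m of its nearest
    previously-ordered neighbors, so the joint density becomes a product
    of small Gaussian conditionals: O(n * m^3) instead of O(n^3).
    With m >= n - 1 this recovers the exact log-likelihood.
    """
    n = X.shape[0]
    ll = 0.0
    for i in range(n):
        # Candidate conditioning set: all earlier points in the ordering.
        prev = np.arange(i)
        if len(prev) > m:
            # Keep the m nearest earlier points in input space.
            d = np.linalg.norm(X[prev] - X[i], axis=1)
            prev = prev[np.argsort(d)[:m]]
        K_nn = kernel(X[prev], X[prev]) + nugget * np.eye(len(prev))
        k_in = kernel(X[i:i + 1], X[prev]).ravel()
        if len(prev):
            mu = k_in @ np.linalg.solve(K_nn, y[prev])
            var = 1.0 + nugget - k_in @ np.linalg.solve(K_nn, k_in)
        else:
            mu, var = 0.0, 1.0 + nugget
        # Accumulate the log of the univariate Gaussian conditional.
        ll += -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)
    return ll
```

For small m the sum is only an approximation, but with well-chosen orderings and neighbor sets it tends to be close to the exact value while scaling linearly in n, which is the computational leverage the abstract attributes to the Vecchia approach.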
