
deepgp Package

Maintainer: Annie Sauer anniees@vt.edu

Performs posterior inference for deep Gaussian processes following Sauer, Gramacy, and Higdon (2020). Models are trained through MCMC, including elliptical slice sampling of latent Gaussian layers and Metropolis-Hastings sampling of kernel hyperparameters. The Vecchia approximation for faster computation is implemented following Sauer, Cooper, and Gramacy (2022). Downstream tasks include sequential design through active learning Cohn/integrated mean squared error (ALC/IMSE; Sauer, Gramacy, and Higdon, 2020) and optimization through expected improvement (EI; Gramacy, Sauer, and Wycoff, 2021). Models extend up to three layers deep; a one-layer model is equivalent to typical Gaussian process regression. Covariance kernel options are Matérn (the default) and squared exponential. The package is applicable to both noisy and deterministic functions, incorporates SNOW parallelization, and uses C and C++ with OpenMP parallelization under the hood.
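
As a quick orientation, below is a minimal usage sketch: fit a two-layer model by MCMC on a toy one-dimensional function, discard burn-in, and predict at new locations. The toy function, MCMC length, and burn/thin settings are illustrative only; consult the package help pages for exact argument names and defaults.

```r
library(deepgp)

# Toy 1d training data with a little noise (illustrative only)
set.seed(1)
x <- seq(0, 1, length.out = 20)
y <- sin(4 * pi * x) + rnorm(length(x), sd = 0.05)

# Fit a two-layer deep GP via MCMC (elliptical slice sampling of the
# latent layer, Metropolis-Hastings for kernel hyperparameters)
fit <- fit_two_layer(x, y, nmcmc = 5000)

# Remove burn-in and thin the chains
fit <- trim(fit, burn = 1000, thin = 2)

# Posterior predictive mean and variance at new locations
x_new <- seq(0, 1, length.out = 100)
fit <- predict(fit, x_new, cores = 1)
plot(fit)
```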

Run help("deepgp-package") or help(package = "deepgp") for more information.
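
Building on a fitted and trimmed model, the sequential design criteria can be used to pick the next input location. The sketch below assumes the candidate-based interface and the value element described in help(ALC) and help(IMSE); the candidate grid size is arbitrary, so verify details against the installed version.

```r
# Candidate locations for the next design point (illustrative grid)
x_cand <- matrix(seq(0, 1, length.out = 50), ncol = 1)

# ALC scores each candidate by expected reduction in predictive variance;
# the candidate with the largest value is selected next
alc <- ALC(fit, x_cand)
x_next <- x_cand[which.max(alc$value), , drop = FALSE]

# IMSE works similarly but is minimized
imse <- IMSE(fit, x_cand)
x_next_imse <- x_cand[which.min(imse$value), , drop = FALSE]
```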

References

Sauer, A., Gramacy, R. B., & Higdon, D. (2020). Active learning for deep Gaussian process surrogates. Technometrics (just-accepted), 1-39.

Sauer, A., Cooper, A., & Gramacy, R. B. (2022). Vecchia-approximated deep Gaussian processes for computer experiments. Preprint, arXiv:2204.02904.
