deepgp-package (R Documentation)
Performs posterior inference for deep Gaussian processes following Sauer, Gramacy, and Higdon (2020) <arXiv:2012.08015>. Models are trained through MCMC, including elliptical slice sampling of latent Gaussian layers and Metropolis-Hastings sampling of kernel hyperparameters. A Vecchia approximation for faster computation is implemented following Sauer, Cooper, and Gramacy (2022) <arXiv:2204.02904>. Downstream tasks include sequential design through active learning Cohn/integrated mean-squared error (ALC/IMSE; Sauer, Gramacy, and Higdon, 2020) and optimization through expected improvement (EI; Gramacy, Sauer, and Wycoff, 2021 <arXiv:2112.07457>). Models extend up to three layers deep; a one-layer model is equivalent to standard Gaussian process regression. Covariance kernel options are Matérn (default) and squared exponential. The package is applicable to both noisy and deterministic functions, incorporates SNOW parallelization, and utilizes C and C++ under the hood.
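A typical workflow pairs one of the fit functions (listed below) with trim, predict, and plot. The following is a minimal sketch only: the data, MCMC counts, and burn-in settings are illustrative assumptions, and exact argument lists are documented on each function's help page.

library(deepgp)

# Toy example: noisy observations of a one-dimensional function
f <- function(x) sin(4 * pi * x)
x <- matrix(seq(0, 1, length = 20), ncol = 1)
y <- as.vector(f(x)) + rnorm(20, sd = 0.1)

# MCMC sampling of hyperparameters and the hidden layer
# (vecchia = TRUE would invoke the Vecchia approximation on larger designs)
fit <- fit_two_layer(x, y, nmcmc = 5000)

# Remove burn-in, thin, then compute posterior means and variances
fit <- trim(fit, burn = 1000, thin = 2)
x_new <- matrix(seq(0, 1, length = 100), ncol = 1)
fit <- predict(fit, x_new)
plot(fit)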
fit_one_layer: conducts MCMC sampling of hyperparameters for a one-layer GP
fit_two_layer: conducts MCMC sampling of hyperparameters and the hidden layer for a two-layer deep GP
fit_three_layer: conducts MCMC sampling of hyperparameters and hidden layers for a three-layer deep GP
continue: collects additional MCMC samples
trim: cuts off burn-in and optionally thins samples
predict: calculates posterior mean and variance over a set of input locations (optionally calculates EI)
plot: produces trace plots, hidden layer plots, and posterior plots
ALC: calculates active learning Cohn over a set of input locations using a reference grid
IMSE: calculates integrated mean-squared error over a set of input locations (ALC, IMSE, and EI are sketched just after this list)
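The sketch below illustrates the sequential design and optimization tasks. The candidate and reference grids are assumptions, as are the $value and $EI elements of the returned objects; see the ALC, IMSE, and predict help pages for exact signatures and return values.

# Score candidate inputs for sequential design, continuing from the
# trimmed, fitted model "fit" in the workflow sketch above
x_cand <- matrix(runif(50), ncol = 1)
ref <- matrix(seq(0, 1, length = 100), ncol = 1)

alc <- ALC(fit, x_cand, ref)    # acquire where ALC is largest
x_next <- x_cand[which.max(alc$value), , drop = FALSE]

imse <- IMSE(fit, x_cand)       # or where IMSE is smallest
x_next <- x_cand[which.min(imse$value), , drop = FALSE]

# Optimization: expected improvement, computed optionally by predict
fit <- predict(fit, x_cand, EI = TRUE)
x_star <- x_cand[which.max(fit$EI), , drop = FALSE]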
Annie Sauer <anniees@vt.edu>
Sauer, A, RB Gramacy, and D Higdon. 2021. "Active Learning for Deep Gaussian
Process Surrogates." Technometrics (just-accepted), 1-39.
Sauer, A, A Cooper, and RB Gramacy. 2022. "Vecchia-approximated Deep Gaussian
Processes for Computer Experiments." Preprint, arXiv:2204.02904.
Katzfuss, M, J Guinness, W Gong, and D Zilber. 2020. "Vecchia approximations
of Gaussian-process predictions." Journal of Agricultural,
Biological, and Environmental Statistics 25, 383-414.
Binois, M, J Huang, RB Gramacy, and M Ludkovski. 2019. "Replication or
Exploration? Sequential Design for Stochastic Simulation Experiments."
Technometrics 61, 7-23. Taylor & Francis.
doi:10.1080/00401706.2018.1469433.
Gramacy, RB. 2020. Surrogates: Gaussian Process Modeling, Design, and
Optimization for the Applied Sciences. Chapman Hall.
Jones, DR, M Schonlau, and WJ Welch. 1998. "Efficient Global Optimization
of Expensive Black-Box Functions." Journal of Global Optimization
13, 455-492. doi:10.1023/A:1008306431147.
Murray, I, RP Adams, and D MacKay. 2010. "Elliptical slice sampling."
Journal of Machine Learning Research 9, 541-548.
Seo, S, M Wallat, T Graepel, and K Obermayer. 2000. "Gaussian Process
Regression: Active Data Selection and Test Point Rejection." In
Mustererkennung 2000, 27-34. New York, NY: Springer-Verlag.
# See "fit_one_layer", "fit_two_layer", "fit_three_layer",
# "ALC", or "IMSE" for examples
# Examples of real-world implementations are available at:
# https://bitbucket.org/gramacylab/deepgp-ex/