data.GP    R Documentation
Functions to supply data to PL for Gaussian process (GP) regression, classification, or combined unknown constraint models
Usage

data.GP(begin, end = NULL, X, Y)

data.GP.improv(begin, end = NULL, f, rect, prior, adapt = ei.adapt,
  cands = 40, save = TRUE, oracle = TRUE, verb = 2,
  interp = interp.loess)

data.CGP(begin, end = NULL, X, C)

data.CGP.adapt(begin, end = NULL, f, rect, prior, cands = 40,
  verb = 2, interp = interp.loess)

data.ConstGP(begin, end = NULL, X, Y, C)

data.ConstGP.improv(begin, end = NULL, f, rect, prior,
  adapt = ieci.const.adapt, cands = 40, save = TRUE, oracle = TRUE,
  verb = 2, interp = interp.loess)
Arguments

begin: positive scalar integer indicating the starting round of sampling

end: positive scalar integer indicating the final round of sampling, or NULL (the default)

X: data.frame or matrix of input locations, with at least end rows

Y: vector of length at least end containing real-valued regression responses

C: vector of length at least end containing class labels

f: function returning a response when called as f(X)

rect: bounding rectangle for the inputs to f

prior: prior parameters passed from PL

adapt: function that evaluates a sequential design criterion on some candidate locations; the default ei.adapt performs expected improvement (EI) calculations, while data.ConstGP.improv defaults to ieci.const.adapt

cands: number of Latin hypercube candidate locations used to choose the next adaptively sampled input design point

save: scalar logical (default TRUE)

oracle: scalar logical (default TRUE)

verb: verbosity level for printing the progress of improv and other adaptive sampling calculations

interp: function for smoothing of 2-d image plots; the default comes from interp.loess
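As a rough illustration of the argument shapes, the sketch below builds toy pre-recorded regression data and hands the first five rounds to data.GP. The toy inputs and responses are invented for illustration, and the call is guarded so it only runs if the plgp package is installed.

```r
## toy pre-recorded regression data: 10 rounds of 2-d inputs with responses
X <- data.frame(x1 = runif(10), x2 = runif(10))  ## one row per round
Y <- rowSums(X^2) + rnorm(10, sd = 0.1)          ## matching response vector

if (requireNamespace("plgp", quietly = TRUE)) {
  ## supply rounds 1 through 5 of the pre-recorded data to PL's data stream
  d <- plgp::data.GP(begin = 1, end = 5, X = X, Y = Y)
  str(d)
}
```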
Details

These functions provide data to PL for Gaussian process regression and classification methods in a variety of ways. The simplest, data.GP and data.CGP, supply pre-recorded regression and classification data stored in data frames and vectors; data.ConstGP is a hybrid that does joint regression and classification. The other functions provide data by active learning/sequential design: data.GP.improv uses expected improvement (EI); data.CGP.adapt uses predictive entropy; data.ConstGP.improv uses integrated expected conditional improvement (IECI). In these cases, once the x-location(s) is/are chosen, the function f is used to provide the response(s).

Value

The outputs are vectors or data.frames.
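The active-learning path can be sketched as below. Two details are assumptions taken from the wider plgp package rather than from this page: the prior.GP() call for generating prior parameters, and the rect layout (one row of lower/upper bounds per input dimension). The objective f is a made-up toy.

```r
f <- function(X) rowSums(X^2)    ## toy black-box objective for illustration
rect <- rbind(c(0, 1), c(0, 1))  ## assumed bounding box: rows = inputs

if (requireNamespace("plgp", quietly = TRUE)) {
  prior <- plgp::prior.GP(2)     ## assumed prior-generating call, 2-d inputs
  ## choose the next input by EI over 40 LHS candidates, then evaluate f there
  d <- plgp::data.GP.improv(begin = 1, f = f, rect = rect, prior = prior,
                            adapt = plgp::ei.adapt, cands = 40,
                            save = FALSE, verb = 0)
}
```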
Author(s)

Robert B. Gramacy, rbg@vt.edu
References

Gramacy, R. and Polson, N. (2011). “Particle learning of Gaussian process models for sequential design and optimization.” Journal of Computational and Graphical Statistics, 20(1), pp. 102-118; arXiv:0909.5262

Gramacy, R. and Lee, H. (2010). “Optimization under unknown constraints.” Bayesian Statistics 9, J. M. Bernardo, M. J. Bayarri, J. O. Berger, A. P. Dawid, D. Heckerman, A. F. M. Smith and M. West (Eds.); Oxford University Press

Gramacy, R. (2020). “Surrogates: Gaussian Process Modeling, Design and Optimization for the Applied Sciences.” Chapman Hall/CRC; https://bobby.gramacy.com/surrogates/
See Also

PL, https://bobby.gramacy.com/r_packages/plgp/
Examples

## See the demos via demo(package="plgp") and the
## examples section of ?plgp