newGP {laGP}    R Documentation
Description

Build a Gaussian process C-side object based on the X-Z data and parameters provided, and augment those objects with new data.

Usage
newGP(X, Z, d, g, dK = FALSE)
newGPsep(X, Z, d, g, dK = FALSE)
updateGP(gpi, X, Z, verb = 0)
updateGPsep(gpsepi, X, Z, verb = 0)
Arguments

X: a matrix or data.frame containing the design matrix of input locations, one row per observation

Z: a vector of responses/dependent values with length(Z) = nrow(X)

d: a positive scalar lengthscale parameter for an isotropic Gaussian correlation function (newGP); or a vector of positive lengthscales, one per column of X, for a separable correlation function (newGPsep); see the sketch after this list

g: a positive scalar nugget parameter

dK: a scalar logical indicating whether or not derivative information (for the lengthscale) should be maintained by the GP object; this is required for calculating MLEs/MAPs of the lengthscale parameter(s) via mleGP

gpi: a C-side GP object identifier (positive integer); e.g., as returned by newGP

gpsepi: similar to gpi, but identifying a separable GP object, e.g., as returned by newGPsep

verb: a non-negative integer indicating the verbosity level. A positive value causes progress statements to be printed to the screen for each of the nrow(X) updates
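For concreteness, the following minimal sketch (toy data; the d and g values are arbitrary illustrations, not recommendations) shows d supplied as a scalar to newGP and as a per-column vector to newGPsep, with dK = TRUE so that lengthscale derivative information is retained for later MLE/MAP calculations.

library(laGP)

## toy 2-d design and responses (illustrative values only)
Xtoy <- matrix(runif(40), ncol=2)
Ztoy <- sin(rowSums(Xtoy))

## isotropic GP: d is a single positive scalar lengthscale
gpi <- newGP(Xtoy, Ztoy, d=0.5, g=1e-6, dK=TRUE)

## separable GP: d is a vector with one lengthscale per column of Xtoy
gpsepi <- newGPsep(Xtoy, Ztoy, d=c(0.5, 0.8), g=1e-6, dK=TRUE)

## free the C-side objects when finished
deleteGP(gpi)
deleteGPsep(gpsepi)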
Details

newGP allocates a new GP object on the C-side and returns its unique integer identifier (gpi), taking time which is cubic in nrow(X); allocated GP objects must (eventually) be destroyed with deleteGP or deleteGPs or memory will leak. The same applies for newGPsep, except deploying a separable correlation with a more limited feature set; see deleteGPsep and deleteGPseps.
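As a minimal sketch of this allocate-then-destroy lifecycle (toy data; the d and g values are arbitrary), the identifier returned by newGP is a plain integer handle, and deleteGPs offers a bulk alternative to deleting objects one at a time.

library(laGP)

## small 1-d toy problem (values are illustrative)
X1 <- matrix(seq(0, 1, length=10), ncol=1)
Z1 <- sin(2*pi*X1)

## allocation takes time cubic in nrow(X1); the return value is an integer handle
gpi1 <- newGP(X1, Z1, d=0.1, g=1e-6)
gpi2 <- newGP(X1, 2*Z1, d=0.2, g=1e-6)

## destroy a single object by its identifier ...
deleteGP(gpi1)

## ... or destroy all remaining (isotropic) GP objects at once;
## deleteGPseps() does the same for separable GPs
deleteGPs()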
updateGP takes a gpi identifier as input and augments that GP with new data. A sequence of updates is performed, one for each i in 1:nrow(X), each taking time which is quadratic in the number of data points. updateGP also updates any statistics needed in order to quickly search for new local design candidates via laGP. The same applies to updateGPsep on gpsepi objects.
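The sketch below (again with arbitrary toy data and parameter values) applies the same update pattern to a separable GP via updateGPsep, using verb = 1 to print a progress statement for each added row.

library(laGP)

## separable GP on a toy 2-d design (lengthscales and nugget are arbitrary)
Xs <- matrix(runif(20), ncol=2)
Zs <- sin(rowSums(Xs))
gpsepi <- newGPsep(Xs, Zs, d=c(0.2, 0.3), g=1e-6)

## augment with three new rows; each internal update is quadratic in the
## current number of data points
Xs2 <- matrix(runif(6), ncol=2)
Zs2 <- sin(rowSums(Xs2))
updateGPsep(gpsepi, Xs2, Zs2, verb=1)

deleteGPsep(gpsepi)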
Value

newGP and newGPsep create a unique GP indicator (gpi or gpsepi) referencing a C-side object; updateGP and updateGPsep do not return anything, but yield a modified C-side object as a side effect.
Author(s)

Robert B. Gramacy rbg@vt.edu
References

For standard GP inference, refer to any graduate text, e.g., Rasmussen & Williams, Gaussian Processes for Machine Learning, or
Gramacy, R. B. (2020) Surrogates: Gaussian Process Modeling, Design and Optimization for the Applied Sciences. Boca Raton, Florida: Chapman Hall/CRC. (See Chapter 6.) https://bobby.gramacy.com/surrogates/
For efficient updates of GPs, see:
R.B. Gramacy and D.W. Apley (2015). Local Gaussian process approximation for large computer experiments. Journal of Computational and Graphical Statistics, 24(2), pp. 561-578; preprint on arXiv:1303.0383; https://arxiv.org/abs/1303.0383
vignette("laGP")
,
deleteGP
, mleGP
, predGP
, laGP
Examples

## for more examples, see predGP and mleGP docs
library(laGP)

## simple sine data
X <- matrix(seq(0,2*pi,length=7), ncol=1)
Z <- sin(X)
## new GP fit
gpi <- newGP(X, Z, 2, 0.000001)
## make predictions
XX <- matrix(seq(-1,2*pi+1, length=499), ncol=ncol(X))
p <- predGP(gpi, XX)
## sample from the predictive distribution
if(require(mvtnorm)) {
N <- 100
ZZ <- rmvt(N, p$Sigma, p$df)
ZZ <- ZZ + t(matrix(rep(p$mean, N), ncol=N))
matplot(XX, t(ZZ), col="gray", lwd=0.5, lty=1, type="l",
xlab="x", ylab="f-hat(x)", bty="n")
points(X, Z, pch=19, cex=2)
}
## update with four more points
X2 <- matrix(c(pi/2, 3*pi/2, -0.5, 2*pi+0.5), ncol=1)
Z2 <- sin(X2)
updateGP(gpi, X2, Z2)
## make a new set of predictions
p2 <- predGP(gpi, XX)
if(require(mvtnorm)) {
ZZ <- rmvt(N, p2$Sigma, p2$df)
ZZ <- ZZ + t(matrix(rep(p2$mean, N), ncol=N))
matplot(XX, t(ZZ), col="gray", lwd=0.5, lty=1, type="l",
xlab="x", ylab="f-hat(x)", bty="n")
points(X, Z, pch=19, cex=2)
points(X2, Z2, pch=19, cex=2, col=2)
}
## clean up
deleteGP(gpi)