GauPro_Gauss        R Documentation
Corr Gauss GP using inherited optim

Description:

     An object of R6Class with methods for fitting a Gaussian process
     model using a Gaussian correlation function.
Super class:

     GauPro::GauPro -> GauPro_Gauss
Public fields:

     corr: Name of correlation

     theta: Correlation parameters

     theta_length: Length of theta

     theta_map: Map for theta

     theta_short: Short vector for theta

     separable: Are the dimensions separable?
Inherited methods:

     GauPro::GauPro$cool1Dplot(), GauPro::GauPro$deviance_searchnug(),
     GauPro::GauPro$fit(), GauPro::GauPro$grad_norm(),
     GauPro::GauPro$initialize_GauPr(), GauPro::GauPro$loglikelihood(),
     GauPro::GauPro$nugget_update(), GauPro::GauPro$optim(),
     GauPro::GauPro$optimRestart(), GauPro::GauPro$plot(),
     GauPro::GauPro$plot1D(), GauPro::GauPro$plot2D(),
     GauPro::GauPro$pred(), GauPro::GauPro$pred_LOO(),
     GauPro::GauPro$pred_mean(), GauPro::GauPro$pred_meanC(),
     GauPro::GauPro$pred_one_matrix(), GauPro::GauPro$pred_var(),
     GauPro::GauPro$predict(), GauPro::GauPro$sample(),
     GauPro::GauPro$update(), GauPro::GauPro$update_K_and_estimates(),
     GauPro::GauPro$update_corrparams(), GauPro::GauPro$update_data(),
     GauPro::GauPro$update_nugget()

Method new(): Create GauPro object
     Usage:

     GauPro_Gauss$new(
       X,
       Z,
       verbose = 0,
       separable = T,
       useC = F,
       useGrad = T,
       parallel = FALSE,
       nug = 1e-06,
       nug.min = 1e-08,
       nug.est = T,
       param.est = T,
       theta = NULL,
       theta_short = NULL,
       theta_map = NULL,
       ...
     )
     Arguments:

     X: Matrix whose rows are the input points

     Z: Output points corresponding to X

     verbose: Amount of output to print. 0 is little, 2 is a lot.

     separable: Are the dimensions separable?

     useC: Should C code be used when possible? It should be faster.

     useGrad: Should the gradient be used?

     parallel: Should the code be run in parallel? This makes
          optimization faster but uses more computer resources.

     nug: Value for the nugget, or the starting value if estimating it.

     nug.min: Minimum allowable value for the nugget.

     nug.est: Should the nugget be estimated?

     param.est: Should the kernel parameters be estimated?

     theta: Correlation parameters

     theta_short: Short vector of correlation parameters (not recommended)

     theta_map: Map for correlation parameters (not recommended)

     ...: Not used
Method corr_func(): Correlation function

     Usage:

     GauPro_Gauss$corr_func(x, x2 = NULL, theta = self$theta)

     Arguments:

     x: First point

     x2: Second point

     theta: Correlation parameter
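The Gaussian (squared exponential) correlation is commonly defined as exp(-sum(theta * (x - x2)^2)). The base-R sketch below is an illustration of that assumed form, not the package's implementation; the name gauss_corr and the exact scaling of theta are assumptions, so compare against GauPro_Gauss$corr_func before relying on it.

```r
# Sketch of a Gaussian correlation function, assuming the form
# exp(-sum(theta * (x - x2)^2)); illustrative only, the package's
# corr_func may parameterize theta differently.
gauss_corr <- function(x, x2, theta) {
  exp(-sum(theta * (x - x2)^2))
}

gauss_corr(c(0, 0), c(0.5, 0.5), theta = c(2, 2))  # correlation decays with distance
```

Identical points have correlation 1, and the correlation shrinks toward 0 as the (theta-weighted) squared distance grows.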
Method deviance_theta(): Calculate deviance

     Usage:

     GauPro_Gauss$deviance_theta(theta)

     Arguments:

     theta: Correlation parameter
Method deviance_theta_log(): Calculate deviance

     Usage:

     GauPro_Gauss$deviance_theta_log(beta)

     Arguments:

     beta: Correlation parameter on log scale
Method deviance(): Calculate deviance

     Usage:

     GauPro_Gauss$deviance(theta = self$theta, nug = self$nug)

     Arguments:

     theta: Correlation parameter

     nug: Nugget
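One common form of GP deviance is the profile negative twice log-likelihood, log det(R) + n * log(Z' R^-1 Z), where R is the correlation matrix plus a nugget on the diagonal. The sketch below illustrates that assumed form in base R; the names gauss_corr_mat and gp_deviance are hypothetical, and GauPro's exact deviance (its constants and profiling) may differ.

```r
# Sketch of a GP deviance under a Gaussian correlation, assuming the
# profile form log det(R) + n * log(Z' R^-1 Z); illustrative only.
gauss_corr_mat <- function(X, theta) {
  n <- nrow(X)
  R <- matrix(1, n, n)
  for (i in 1:(n - 1)) for (j in (i + 1):n) {
    R[i, j] <- R[j, i] <- exp(-sum(theta * (X[i, ] - X[j, ])^2))
  }
  R
}

gp_deviance <- function(X, Z, theta, nug) {
  n <- nrow(X)
  R <- gauss_corr_mat(X, theta) + diag(nug, n)  # nugget keeps R well-conditioned
  Rinv_Z <- solve(R, Z)
  logdetR <- 2 * sum(log(diag(chol(R))))        # log-determinant via Cholesky
  logdetR + n * log(c(crossprod(Z, Rinv_Z)))
}

X <- matrix(seq(0, 1, length.out = 6), ncol = 1)
Z <- sin(2 * pi * X[, 1])
gp_deviance(X, Z, theta = 5, nug = 1e-6)
```

Optimizing theta and the nugget then amounts to minimizing this quantity, which is what the inherited optim machinery does.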
Method deviance_grad(): Calculate deviance gradient

     Usage:

     GauPro_Gauss$deviance_grad(
       theta = NULL,
       nug = self$nug,
       joint = NULL,
       overwhat = if (self$nug.est) "joint" else "theta"
     )

     Arguments:

     theta: Correlation parameter

     nug: Nugget

     joint: Calculate over theta and nug at the same time?

     overwhat: Whether to calculate over "theta" alone or jointly with
          the nugget ("joint")
Method deviance_fngr(): Calculate deviance and gradient at the same time

     Usage:

     GauPro_Gauss$deviance_fngr(
       theta = NULL,
       nug = NULL,
       overwhat = if (self$nug.est) "joint" else "theta"
     )

     Arguments:

     theta: Correlation parameter

     nug: Nugget

     overwhat: Whether to calculate over "theta" alone or jointly with
          the nugget ("joint")
Method deviance_log(): Calculate deviance with parameters on log scale

     Usage:

     GauPro_Gauss$deviance_log(beta = NULL, nug = self$nug, joint = NULL)

     Arguments:

     beta: Correlation parameter on log scale

     nug: Nugget

     joint: Calculate over theta and nug at the same time?
Method deviance_log2(): Calculate deviance on log scale

     Usage:

     GauPro_Gauss$deviance_log2(beta = NULL, lognug = NULL, joint = NULL)

     Arguments:

     beta: Correlation parameter on log scale

     lognug: Log of nugget

     joint: Calculate over theta and nug at the same time?
Method deviance_log_grad(): Calculate deviance gradient on log scale

     Usage:

     GauPro_Gauss$deviance_log_grad(
       beta = NULL,
       nug = self$nug,
       joint = NULL,
       overwhat = if (self$nug.est) "joint" else "theta"
     )

     Arguments:

     beta: Correlation parameter on log scale

     nug: Nugget

     joint: Calculate over theta and nug at the same time?

     overwhat: Whether to calculate over "theta" alone or jointly with
          the nugget ("joint")
Method deviance_log2_grad(): Calculate deviance gradient on log scale

     Usage:

     GauPro_Gauss$deviance_log2_grad(
       beta = NULL,
       lognug = NULL,
       joint = NULL,
       overwhat = if (self$nug.est) "joint" else "theta"
     )

     Arguments:

     beta: Correlation parameter on log scale

     lognug: Log of nugget

     joint: Calculate over theta and nug at the same time?

     overwhat: Whether to calculate over "theta" alone or jointly with
          the nugget ("joint")
Method deviance_log2_fngr(): Calculate deviance and gradient on log scale

     Usage:

     GauPro_Gauss$deviance_log2_fngr(
       beta = NULL,
       lognug = NULL,
       joint = NULL,
       overwhat = if (self$nug.est) "joint" else "theta"
     )

     Arguments:

     beta: Correlation parameter on log scale

     lognug: Log of nugget

     joint: Calculate over theta and nug at the same time?

     overwhat: Whether to calculate over "theta" alone or jointly with
          the nugget ("joint")
Method get_optim_functions(): Get optimization functions

     Usage:

     GauPro_Gauss$get_optim_functions(param_update, nug.update)

     Arguments:

     param_update: Should the parameters be updated?

     nug.update: Should the nugget be updated?
Method param_optim_lower(): Lower bounds of the parameters for optimization

     Usage:

     GauPro_Gauss$param_optim_lower()

Method param_optim_upper(): Upper bounds of the parameters for optimization

     Usage:

     GauPro_Gauss$param_optim_upper()

Method param_optim_start(): Starting values of the parameters for optimization

     Usage:

     GauPro_Gauss$param_optim_start()

Method param_optim_start0(): Starting values of the parameters for optimization

     Usage:

     GauPro_Gauss$param_optim_start0()
Method param_optim_jitter(): Jitter the parameter values for optimization

     Usage:

     GauPro_Gauss$param_optim_jitter(param_value)

     Arguments:

     param_value: Parameter value to add jitter to
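Jittering start values is a standard way to make optimization restarts explore different starting points. The sketch below is a generic illustration of the idea, not GauPro's implementation; the name jitter_params and the Gaussian noise scale are assumptions.

```r
# Hypothetical stand-in for jittering optimizer start values:
# add small Gaussian noise so each restart begins somewhere slightly new.
jitter_params <- function(param_value, sd = 0.1) {
  param_value + rnorm(length(param_value), mean = 0, sd = sd)
}

set.seed(42)
jitter_params(c(1, 2, 3))  # each element perturbed a little
```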
Method update_params(): Update parameter values after optimization

     Usage:

     GauPro_Gauss$update_params(restarts, param_update, nug.update)

     Arguments:

     restarts: Number of restarts

     param_update: Are the parameters being updated?

     nug.update: Is the nugget being updated?
Method grad(): Calculate the gradient

     Usage:

     GauPro_Gauss$grad(XX)

     Arguments:

     XX: Points to calculate the gradient at
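An analytic gradient such as the one grad() returns can be sanity-checked against central finite differences. The helper below (num_grad is a hypothetical name, not part of GauPro) works for any scalar-valued R function and could be applied to a fitted model's predictive mean.

```r
# Central finite-difference gradient of a scalar function f at x;
# useful for cross-checking an analytic gradient numerically.
num_grad <- function(f, x, h = 1e-5) {
  sapply(seq_along(x), function(i) {
    e <- replace(numeric(length(x)), i, h)  # step h in coordinate i only
    (f(x + e) - f(x - e)) / (2 * h)
  })
}

num_grad(function(x) sum(sin(x)), c(0, pi / 2))  # approximates cos at each coordinate
```

Agreement to a few decimal places between the numeric and analytic gradients is a quick check that a gradient implementation is consistent with the function it claims to differentiate.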
Method grad_dist(): Calculate the gradient distribution

     Usage:

     GauPro_Gauss$grad_dist(XX)

     Arguments:

     XX: Points to calculate the gradient at
Method hessian(): Calculate the Hessian

     Usage:

     GauPro_Gauss$hessian(XX, useC = self$useC)

     Arguments:

     XX: Points to calculate the Hessian at

     useC: Should C code be used to speed up the calculation?
Method print(): Print this object

     Usage:

     GauPro_Gauss$print()

Method clone(): The objects of this class are cloneable with this method.

     Usage:

     GauPro_Gauss$clone(deep = FALSE)

     Arguments:

     deep: Whether to make a deep clone.
Examples:

     n <- 12
     x <- matrix(seq(0, 1, length.out = n), ncol = 1)
     y <- sin(2 * pi * x) + rnorm(n, 0, 1e-1)
     gp <- GauPro_Gauss$new(X = x, Z = y, parallel = FALSE)