cv.rbst: Cross-Validation for Nonconvex Loss Boosting


Cross-Validation for Nonconvex Loss Boosting

Description

Cross-validated estimation of the empirical risk or error, which can be used for tuning parameter selection.

Usage

cv.rbst(x, y, K = 10, cost = 0.5, rfamily = c("tgaussian", "thuber", "thinge", 
"tbinom", "binomd", "texpo", "tpoisson", "clossR", "closs", "gloss", "qloss"), 
learner = c("ls", "sm", "tree"), ctrl = bst_control(), type = c("loss", "error"), 
plot.it = TRUE, main = NULL, se = TRUE, n.cores=2,...)

Arguments

x

a data frame containing the variables in the model.

y

vector of responses; y must be in {1, -1} for binary classification.

K

number of folds for K-fold cross-validation.

cost

price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1 - cost.

rfamily

nonconvex loss function types.

learner

a character string specifying the component-wise base learner to be used: "ls" for linear models, "sm" for smoothing splines, "tree" for regression trees.

ctrl

an object of class bst_control.

type

cross-validation criterion: for type="loss", the loss function values are used; for type="error", the misclassification error is used.

plot.it

a logical value; if TRUE, plot the cross-validated estimate of the loss or error.

main

title of the plot.

se

a logical value; if TRUE, standard error bars are added to the plot.

n.cores

The number of CPU cores to use. The cross-validation loop will attempt to send different CV folds off to different cores.

...

additional arguments.
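
A minimal sketch (not part of the original help page) of how the documented cost, type and n.cores arguments can be combined: cost-sensitive cross-validation with the misclassification error criterion and folds distributed over two cores. The data x and y are assumed to be prepared as in the Examples section below.

cv.err <- cv.rbst(x, y, K = 5, cost = 0.3, rfamily = "thinge", learner = "ls",
                  ctrl = bst_control(mstop = 50), type = "error",
                  plot.it = FALSE, n.cores = 2)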

Value

An object containing the following components:

residmat

empirical risks from each cross-validation fold at the boosting iterations.

mstop

boosting iteration steps at which the CV curve is computed.

cv

The CV curve at each value of mstop

cv.error

The standard error of the CV curve

rfamily

nonconvex loss function types.

...
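
As a hedged illustration of how the mstop and cv components might be combined for tuning parameter selection (the object names cv.out, opt.mstop and fit.opt are illustrative, not part of the package):

cv.out <- cv.rbst(x, y, ctrl = bst_control(mstop = 50), rfamily = "thinge",
                  learner = "ls", type = "loss", plot.it = FALSE)
## boosting iteration with the smallest cross-validated loss
opt.mstop <- cv.out$mstop[which.min(cv.out$cv)]
## refit rbst at the selected number of iterations
fit.opt <- rbst(x, y, ctrl = bst_control(mstop = opt.mstop),
                rfamily = "thinge", learner = "ls")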

Author(s)

Zhu Wang

See Also

rbst

Examples

## Not run: 
x <- matrix(rnorm(100*5),ncol=5)
c <- 2*x[,1]
p <- exp(c)/(exp(c)+exp(-c))
y <- rbinom(100,1,p)
y[y != 1] <- -1
x <- as.data.frame(x)
cv.rbst(x, y, ctrl = bst_control(mstop=50), rfamily = "thinge", learner = "ls", type="loss")
cv.rbst(x, y, ctrl = bst_control(mstop=50), rfamily = "thinge", learner = "ls", type="error")
dat.m <- rbst(x, y, ctrl = bst_control(mstop=50), rfamily = "thinge", learner = "ls")
dat.m1 <- cv.rbst(x, y, ctrl = bst_control(twinboost=TRUE, coefir=coef(dat.m), 
    xselect.init = dat.m$xselect, mstop=50), rfamily = "thinge", learner="ls")

## End(Not run)
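
A further hedged sketch, assuming dat.m1 above contains the documented mstop, cv and cv.error components: re-drawing the CV curve by hand with standard error bars.

## Not run: 
plot(dat.m1$mstop, dat.m1$cv, type = "l",
     xlab = "Boosting iteration", ylab = "CV loss")
segments(dat.m1$mstop, dat.m1$cv - dat.m1$cv.error,
         dat.m1$mstop, dat.m1$cv + dat.m1$cv.error, col = "grey")

## End(Not run)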
