cv.GLMBoost: Cross-validation for GLMBoost fits


Description

Convenience wrapper around cv.GAMBoost that performs K-fold cross-validation for GLMBoost in search of the optimal number of boosting steps.

Usage

cv.GLMBoost(x,y,penalty=length(y),just.criterion=TRUE,...)

Arguments

y

response vector of length n.

x

n * q matrix of covariates with linear influence.

penalty

penalty for the covariates with linear influence.

just.criterion

logical value indicating whether a list with the goodness-of-fit information should be returned, or a GLMBoost fit with the optimal number of steps.

...

parameters to be passed to cv.GAMBoost or, subsequently, GAMBoost.

Value

GLMBoost fit with the optimal number of boosting steps, or a list with the following components:

criterion

vector with goodness-of-fit criterion for boosting steps 1, ..., maxstep.

se

vector with standard error estimates for the goodness-of-fit criterion in each boosting step.

selected

index of the optimal boosting step.
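
The two return modes can be sketched as follows (a minimal sketch assuming the GAMBoost package is available; the data generation and parameter values mirror the Examples section below):

    ## Generate example data as in the Examples section
    library(GAMBoost)
    x <- matrix(runif(100*8, min = -1, max = 1), 100, 8)
    eta <- -0.5 + 2*x[,1] + 4*x[,3]
    y <- rbinom(100, 1, binomial()$linkinv(eta))

    ## just.criterion = TRUE (the default): returns the goodness-of-fit list
    cv.res <- cv.GLMBoost(x, y, penalty = 100, maxstepno = 100,
                          family = binomial(), K = 10, type = "error")
    cv.res$selected                    # index of the optimal boosting step
    cv.res$criterion[cv.res$selected]  # criterion value at that step

    ## just.criterion = FALSE: returns a GLMBoost fit with the optimal
    ## number of boosting steps instead of the list above
    cv.fit <- cv.GLMBoost(x, y, penalty = 100, maxstepno = 100,
                          family = binomial(), K = 10, type = "error",
                          just.criterion = FALSE)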

Author(s)

Harald Binder <binderh@uni-mainz.de>

See Also

GLMBoost, cv.GAMBoost, GAMBoost

Examples

## Not run: 
##  Generate some data 
x <- matrix(runif(100*8,min=-1,max=1),100,8)             
eta <- -0.5 + 2*x[,1] + 4*x[,3]
y <- rbinom(100,1,binomial()$linkinv(eta))

##  Fit the model with only linear components
gb1 <- GLMBoost(x,y,penalty=100,stepno=100,trace=TRUE,family=binomial()) 


##  10-fold cross-validation with prediction error as a criterion
gb1.crit <- cv.GLMBoost(x,y,penalty=100,maxstepno=100,trace=TRUE,
                        family=binomial(),
                        K=10,type="error")

##  Compare AIC and estimated prediction error

which.min(gb1$AIC)          
which.min(gb1.crit$criterion)

## End(Not run)

GAMBoost documentation built on May 2, 2019, 12:40 p.m.