Description

A convenience wrapper around cv.GAMBoost for performing K-fold cross-validation for GLMBoost, in search of the optimal number of boosting steps.
Usage

cv.GLMBoost(x,y,penalty=length(y),just.criterion=TRUE,...)
Arguments

y
    response vector of length n.

x
    n * p matrix of covariates with linear influence.

penalty
    penalty for the covariates with linear influence.

just.criterion
    logical value indicating whether a list with the goodness-of-fit information should be returned, or a GLMBoost fit with the optimal number of boosting steps.

...
    parameters to be passed to cv.GAMBoost.
Value

GLMBoost fit with the optimal number of boosting steps, or a list with the following components:
criterion
    vector with the goodness-of-fit criterion in each boosting step.

se
    vector with standard error estimates for the goodness-of-fit criterion in each boosting step.

selected
    index of the optimal boosting step.
Author(s)

Harald Binder binderh@uni-mainz.de
See Also

GLMBoost, cv.GAMBoost, GAMBoost
Examples

## Not run:
## Generate some data
x <- matrix(runif(100*8,min=-1,max=1),100,8)
eta <- -0.5 + 2*x[,1] + 4*x[,3]
y <- rbinom(100,1,binomial()$linkinv(eta))
## Fit the model with only linear components
gb1 <- GLMBoost(x,y,penalty=100,stepno=100,trace=TRUE,family=binomial())
## 10-fold cross-validation with prediction error as a criterion
gb1.crit <- cv.GLMBoost(x,y,penalty=100,maxstepno=100,trace=TRUE,
family=binomial(),
K=10,type="error")
## Compare AIC and estimated prediction error
which.min(gb1$AIC)
which.min(gb1.crit$criterion)
## End(Not run)
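A natural next step, not shown above, is to refit the model using the cross-validated number of boosting steps. A minimal sketch continuing the example (like the example itself, it assumes the GAMBoost package is loaded and that x, y, and gb1.crit exist; the object name gb1.opt is illustrative):

```r
## Not run:
## Refit with the number of boosting steps selected by cross-validation;
## the "selected" component holds the index of the optimal boosting step
gb1.opt <- GLMBoost(x,y,penalty=100,stepno=gb1.crit$selected,
                    family=binomial())
## End(Not run)
```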