Description
Roughly speaking, boosting combines 'weak learners' in a weighted manner into a stronger ensemble.
'Weak learners' here consist of linear functions in one component (variable), as proposed by Buehlmann and Yu (2003).
It also generates sparsity and can likewise be used for variable selection alone (see GeneSelection); a hedged sketch of this use follows below.
For S4 method information, see compBoostCMA-methods.
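As a minimal sketch of the variable-selection use, assuming the CMA interface in which GeneSelection accepts componentwise boosting as a ranking method (the method name "boosting", GenerateLearningsets, and the toplist call are assumptions about the package interface; verify against your installed version):

library(CMA)
data(golub)
golubY <- golub[,1]
golubX <- as.matrix(golub[,-1])
### learning sets for the ranking (Monte-Carlo CV)
set.seed(111)
ls <- GenerateLearningsets(y = golubY, method = "MCCV", niter = 3,
                           ntrain = floor(2/3 * length(golubY)))
### rank genes by componentwise boosting only, without classification
### (method name "boosting" assumed from the CMA GeneSelection docs)
gsel <- GeneSelection(X = golubX, y = golubY, learningsets = ls,
                      method = "boosting")
toplist(gsel, k = 10, iter = 1)   ### ten top-ranked genes, first iteration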
Usage

compBoostCMA(X, y, f, learnind, loss = c("binomial", "exp", "quadratic"),
             mstop = 100, nu = 0.1, models = FALSE, ...)
Arguments

X
Gene expression data. Can be one of the following:
- A matrix. Rows correspond to observations, columns to variables.
- A data.frame, when f is not missing (see below).
- An object of class ExpressionSet.

y
Class labels. Can be one of the following:
- A numeric vector.
- A factor.
- A character if X is an ExpressionSet that specifies the phenotype variable.
- missing, if X is a data.frame and a proper formula f is provided.
WARNING: The class labels will be re-coded to range from 0 to K-1, where K is the total number of different classes in the learning set.

f
A two-sided formula, if X is a data.frame. The left part corresponds to the class labels, the right to the variables.

learnind
An index vector specifying the observations that belong to the learning set. May be missing; in that case, the learning set consists of all observations and predictions are made on the learning set.

loss
Character specifying the loss function: one of "binomial" (LogitBoost), "exp" (AdaBoost), "quadratic" (L2Boost).

mstop
Number of boosting iterations, i.e. the number of updates to perform. The default (100) does not necessarily produce good results; tuning this argument via the tune function is therefore highly recommended (see the sketch after this list).

nu
Shrinkage factor applied to the update steps, defaults to 0.1. In most cases it suffices to tune mstop and to leave nu fixed at its default.

models
A logical value indicating whether the model object shall be returned.

...
Currently unused arguments.
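As recommended for mstop above, the iteration number should be tuned rather than left at its default. A minimal sketch, assuming the CMA tuning workflow (GenerateLearningsets, tune with a grids argument, and best for extracting results are assumptions about the package interface; the grid values are arbitrary):

library(CMA)
data(golub)
golubY <- golub[,1]
golubX <- as.matrix(golub[,-1])
### five-fold stratified CV learning sets for tuning
set.seed(111)
ls <- GenerateLearningsets(y = golubY, method = "CV", fold = 5, strat = TRUE)
### candidate values for mstop; nu stays fixed at its default
tuned <- tune(X = golubX, y = golubY, learningsets = ls,
              classifier = compBoostCMA,
              grids = list(mstop = c(50, 100, 250, 500, 1000)))
### best mstop per learning set (best() assumed from CMA's tuningresult class)
unlist(best(tuned))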
Details

The method is partly based on code from the package mboost by T. Hothorn and P. Buehlmann.
The algorithm for the multiclass case is described in Lutz and Buehlmann (2006) as 'rowwise updating'; a schematic sketch is given below.
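To make 'rowwise updating' concrete, the following self-contained sketch implements componentwise boosting for a multivariate (class-indicator) response with quadratic loss, under the reading that the selected variable's entire row of the coefficient matrix is updated per iteration. It is an illustration of the idea, not the package's internal implementation (which defaults to the binomial loss); consult Lutz and Buehlmann (2006) for the precise algorithm.

### componentwise L2 boosting with rowwise updating (illustration only)
rowwiseL2Boost <- function(X, Y, mstop = 100, nu = 0.1) {
  X <- scale(X)                         # standardize predictors
  Y <- scale(Y, scale = FALSE)          # center each response column
  B <- matrix(0, ncol(X), ncol(Y))      # coefficients: variables x responses
  R <- Y                                # current residual matrix
  ss <- colSums(X^2)                    # equal across columns after scaling
  for (m in seq_len(mstop)) {
    C <- crossprod(X, R) / ss           # LS coefficient of each (variable, response) pair
    j <- which.max(rowSums(C^2))        # variable reducing quadratic loss most
    B[j, ] <- B[j, ] + nu * C[j, ]      # shrunken update of one entire row
    R <- R - nu * X[, j] %*% t(C[j, ])  # refit residuals of all responses
  }
  B                                     # sparse: unselected variables stay zero
}
### usage on dummy data with a 3-class indicator response
set.seed(1)
X <- matrix(rnorm(50 * 20), 50, 20)
Y <- diag(3)[sample(3, 50, replace = TRUE), ]
B <- rowwiseL2Boost(X, Y, mstop = 200)
which(rowSums(B != 0) > 0)              # variables selected at least once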
Value

An object of class clvarseloutput.
Author(s)

Martin Slawski ms@cs.uni-sb.de
Anne-Laure Boulesteix boulesteix@ibe.med.uni-muenchen.de
References

Buehlmann, P., Yu, B. (2003). Boosting with the L2 loss: Regression and Classification. Journal of the American Statistical Association, 98, 324-339.

Buehlmann, P., Hothorn, T. Boosting: A statistical perspective. Statistical Science (to appear).

Lutz, R., Buehlmann, P. (2006). Boosting for high-multivariate responses in high-dimensional linear regression. Statistica Sinica, 16, 471-494.
See Also

dldaCMA, ElasticNetCMA, fdaCMA, flexdaCMA, gbmCMA, knnCMA, ldaCMA, LassoCMA, nnetCMA, pknnCMA, plrCMA, pls_ldaCMA, pls_lrCMA, pls_rfCMA, pnnCMA, qdaCMA, rfCMA, scdaCMA, shrinkldaCMA, svmCMA
Examples

### load Golub AML/ALL data
data(golub)
### extract class labels
golubY <- golub[,1]
### extract gene expression
golubX <- as.matrix(golub[,-1])
### select learningset
ratio <- 2/3
set.seed(111)
learnind <- sample(length(golubY), size=floor(ratio*length(golubY)))
### run componentwise (logit)-boosting (not tuned)
result <- compBoostCMA(X=golubX, y=golubY, learnind=learnind, mstop = 500)
### show results
show(result)
ftable(result)
plot(result)
### multiclass example:
### load Khan data
data(khan)
### extract class labels
khanY <- khan[,1]
### extract gene expression
khanX <- as.matrix(khan[,-1])
### select learningset
set.seed(111)
learnind <- sample(length(khanY), size=floor(ratio*length(khanY)))
### run componentwise multivariate (logit)-boosting (not tuned)
result <- compBoostCMA(X=khanX, y=khanY, learnind=learnind, mstop = 1000)
### show results
show(result)
ftable(result)
plot(result)