Description
Roughly speaking, boosting combines 'weak learners'
in a weighted manner into a stronger ensemble. This
method calls the function gbm.fit from the
package gbm. The 'weak learners' are
simple trees that need only very few splits (default: 1).
For S4 method information, see gbmCMA-methods.
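A minimal sketch on simulated data of how tuning arguments reach the weak learners. It assumes that arguments supplied via '...' (here interaction.depth and shrinkage, both arguments of gbm.fit) are forwarded to gbm.fit unchanged; the data, object names and parameter values below are illustrative only, not taken from the package documentation.

library(CMA)
set.seed(1)
Xsim <- matrix(rnorm(100 * 20), nrow = 100)               # 100 observations, 20 simulated "genes"
ysim <- factor(sample(c("A", "B"), 100, replace = TRUE))  # binary class labels
learn <- sample(100, 67)                                   # indices of the learning set
### weak learners with 2 splits instead of the default 1, and a smaller learning rate
### (assumes interaction.depth/shrinkage are passed through '...' to gbm.fit)
res <- gbmCMA(X = Xsim, y = ysim, learnind = learn,
              n.trees = 200, interaction.depth = 2, shrinkage = 0.01)
ftable(res)   # confusion table for the observations outside the learning set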
Usage

gbmCMA(X, y, f, learnind, models = FALSE, ...)
Arguments

X: Gene expression data. Can be one of the following:
   - A matrix. Rows correspond to observations, columns to variables.
   - A data.frame, if f is not missing (see below).
   - An object of class ExpressionSet.

y: Class labels. Can be one of the following:
   - A numeric vector.
   - A factor.
   - A character specifying the name of the phenotype variable, if X is an ExpressionSet.
   - missing, if X is a data.frame and a proper formula f is provided.
   WARNING: The class labels will be re-coded to range from 0 to K-1, where K is the total number of different classes in the learning set.

f: A two-sided formula, if X is a data.frame. The left part corresponds to the class labels, the right part to the variables.

learnind: An index vector specifying the observations that belong to the learning set; see the short sketch after this argument list. May be missing; in that case, the learning set consists of all observations and predictions are made on the learning set.

models: A logical value indicating whether the fitted model object shall be returned.

...: Further arguments passed to the function gbm.fit from the package gbm, e.g. n.trees, the number of trees in the ensemble.
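As a hedged illustration of the learnind argument (simulated data; object names and values are not from the package documentation): when learnind is omitted, all observations are expected to form the learning set, so the reported predictions are resubstitution predictions on the same data.

library(CMA)
set.seed(2)
Xs <- matrix(rnorm(60 * 10), nrow = 60)            # 60 observations, 10 simulated variables
ys <- factor(sample(c(0, 1), 60, replace = TRUE))  # binary class labels
### no learnind supplied: the learning set is all observations (CMA typically warns here)
res_all <- gbmCMA(X = Xs, y = ys, n.trees = 100)
ftable(res_all)   # confusion table on the learning data itself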
Value

An object of class cloutput.
Note

Up to now, this method can only be applied to binary classification.
Author(s)

Martin Slawski ms@cs.uni-sb.de
Anne-Laure Boulesteix boulesteix@ibe.med.uni-muenchen.de
References

Ridgeway, G. (1999). The state of boosting. Computing Science and Statistics, 31: 172-181.

Friedman, J. (2001). Greedy Function Approximation: A Gradient Boosting Machine. Annals of Statistics, 29(5): 1189-1232.
See Also

compBoostCMA, dldaCMA, ElasticNetCMA,
fdaCMA, flexdaCMA,
knnCMA, ldaCMA, LassoCMA,
nnetCMA, pknnCMA, plrCMA,
pls_ldaCMA, pls_lrCMA, pls_rfCMA,
pnnCMA, qdaCMA, rfCMA,
scdaCMA, shrinkldaCMA, svmCMA
Examples

### load the CMA package and the Golub AML/ALL data
library(CMA)
data(golub)
### extract class labels
golubY <- golub[,1]
### extract gene expression
golubX <- as.matrix(golub[,-1])
### select learningset
ratio <- 2/3
set.seed(111)
learnind <- sample(length(golubY), size=floor(ratio*length(golubY)))
### run tree-based gradient boosting (no tuning)
gbmresult <- gbmCMA(X=golubX, y=golubY, learnind=learnind, n.trees = 500)
show(gbmresult)
ftable(gbmresult)
plot(gbmresult)