Description

Most general function in the package, providing an interface to perform
variable selection, hyperparameter tuning and classification in one step.
Alternatively, the first two steps can be performed separately and their
results can then be plugged into this function.
For S4 method information, see classification-methods.
Usage

classification(X, y, f, learningsets, genesel, genesellist = list(), nbgene,
               classifier, tuneres, tuninglist = list(), trace = TRUE,
               models = FALSE, ...)
Arguments

X: Gene expression data. Can be one of the following: a matrix (rows
   correspond to observations, columns to variables), a data.frame, or an
   object of class ExpressionSet.

y: Class labels. Can be one of the following: a numeric vector, a factor, a
   character string naming the phenotype variable if X is an ExpressionSet,
   or missing if X is a data.frame and a proper formula f is provided.
   WARNING: the class labels will be re-coded to range from 0 to K-1, where
   K is the total number of different classes in the learning set.

f: A two-sided formula, if X is a data.frame. The left side specifies the
   class labels, the right side the variables.

learningsets: An object of class learningsets. May be missing; then the
   complete dataset is used as learning set.

genesel: Optional (but usually recommended) object of class genesel
   containing variable importance information for the argument learningsets.

genesellist: In the case that the argument genesel is missing, this is an
   argument list passed to GeneSelection. If both genesel and genesellist
   are missing, no variable selection is performed.

nbgene: Number of best genes to be kept for classification, based on either
   genesel or the call to GeneSelection using genesellist. If both are
   missing, this argument is not required.

classifier: Name of a function ending with CMA indicating the classifier to
   be used.

tuneres: Analogous to the argument genesel: an object of class tuningresult
   containing information about the best hyperparameter choice for the
   argument learningsets.

tuninglist: Analogous to the argument genesellist. In the case that the
   argument tuneres is missing, this is an argument list passed to tune. If
   both tuneres and tuninglist are missing, no hyperparameter tuning is
   performed.

trace: Should progress be traced? Default is TRUE.

models: A logical value indicating whether the model objects shall be
   returned. Default is FALSE.

...: Further arguments passed to the function classifier.
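The warning under y can be illustrated with a plain base-R sketch (hypothetical labels; this is not CMA's internal code, only the 0 to K-1 re-coding it describes):

```r
# Hypothetical class labels; CMA re-codes them to integers 0, ..., K-1.
y <- factor(c("ALL", "AML", "ALL", "AML"))
recoded <- as.integer(y) - 1L   # 0-based class codes in level order
recoded                         # 0 1 0 1
```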
Details

For details about hyperparameter tuning, consult tune.
Value

A list of objects of class cloutput and clvarseloutput, respectively; its
length equals the number of different learningsets. The single elements of
the list can conveniently be combined using the join function. The results
can be analyzed and evaluated by various measures using the method
evaluation.
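As a plain base-R illustration of the "misclassification" measure computed by evaluation (hypothetical label vectors, not actual CMA output):

```r
# Hypothetical true and predicted class labels (0-based coding as in CMA)
truth <- c(0, 1, 1, 0, 1)
pred  <- c(0, 1, 0, 0, 0)
mean(truth != pred)   # misclassification rate: fraction wrong = 0.4
```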
Author(s)

Martin Slawski ms@cs.uni-sb.de
Anne-Laure Boulesteix boulesteix@ibe.med.uni-muenchen.de
Christoph Bernau bernau@ibe.med.uni-muenchen.de
References

Slawski, M., Daumer, M., Boulesteix, A.-L. (2008). CMA - a comprehensive Bioconductor package for supervised classification with high dimensional data. BMC Bioinformatics 9: 439.
See Also

GeneSelection, tune, evaluation, compBoostCMA, dldaCMA, ElasticNetCMA,
fdaCMA, flexdaCMA, gbmCMA, knnCMA, ldaCMA, LassoCMA, nnetCMA, pknnCMA,
plrCMA, pls_ldaCMA, pls_lrCMA, pls_rfCMA, pnnCMA, qdaCMA, rfCMA, scdaCMA,
shrinkldaCMA, svmCMA
Examples

### a simple k-nearest neighbour example
### datasets
## Not run:
data(golub)
golubY <- golub[,1]
golubX <- as.matrix(golub[,-1])
### learningsets
set.seed(111)
lset <- GenerateLearningsets(y = golubY, method = "CV", fold = 5, strat = TRUE)
### 1. GeneSelection
selttest <- GeneSelection(golubX, golubY, learningsets = lset, method = "t.test")
### 2. tuning
tunek <- tune(golubX, golubY, learningsets = lset, genesel = selttest, nbgene = 20, classifier = knnCMA)
### 3. classification
knn1 <- classification(golubX, golubY, learningsets = lset, genesel = selttest,
tuneres = tunek, nbgene = 20, classifier = knnCMA)
### steps 1.-3. combined into one step:
knn2 <- classification(golubX, golubY, learningsets = lset,
genesellist = list(method = "t.test"), classifier = knnCMA,
tuninglist = list(grids = list(k = c(1:8))), nbgene = 20)
### show and analyze results:
knnjoin <- join(knn2)
show(knn2)
eval <- evaluation(knn2, measure = "misclassification")
show(eval)
summary(eval)
boxplot(eval)
## End(Not run)