sda: Shrinkage Discriminant Analysis 2: Training Step


View source: R/sda.R

Description

sda trains an LDA (linear discriminant analysis) or DDA (diagonal discriminant analysis) classifier using James-Stein-type shrinkage estimation.

Usage

sda(Xtrain, L, lambda, lambda.var, lambda.freqs, diagonal=FALSE, verbose=TRUE)
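
A minimal sketch of a typical call on small simulated data is given below; the simulated data and the object names X, y and fit are purely illustrative and not part of the package.

library("sda")
set.seed(1)
X = rbind(matrix(rnorm(20*5), 20, 5),           # class "A": mean 0
          matrix(rnorm(20*5, mean=1), 20, 5))   # class "B": mean 1
y = factor(rep(c("A", "B"), each=20))
fit = sda(X, y)          # shrinkage LDA, all intensities estimated
predict(fit, X)$class    # predicted labels, see ?predict.sda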

Arguments

Xtrain

A matrix containing the training data set. Note that the rows correspond to observations and the columns to variables.

L

A factor with the class labels of the training samples.

lambda

Shrinkage intensity for the correlation matrix. If not specified it is estimated from the data. lambda=0 implies no shrinkage and lambda=1 complete shrinkage (a short sketch after the argument list shows how the intensities can be fixed manually).

lambda.var

Shrinkage intensity for the variances. If not specified it is estimated from the data. lambda.var=0 implies no shrinkage and lambda.var=1 complete shrinkage.

lambda.freqs

Shrinkage intensity for the frequencies. If not specified it is estimated from the data. lambda.freqs=0 implies no shrinkage (i.e. empirical frequencies) and lambda.freqs=1 complete shrinkage (i.e. uniform frequencies).

diagonal

Chooses between LDA (default, diagonal=FALSE) and DDA (diagonal=TRUE).

verbose

Print progress information while computing.
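
As a rough sketch of how the shrinkage arguments interact (Xtrain and Ytrain as in the Examples section below; the particular settings are only illustrative):

fit1 = sda(Xtrain, Ytrain)                  # all three intensities estimated from the data
fit2 = sda(Xtrain, Ytrain, lambda=1)        # force complete shrinkage of the correlations
fit3 = sda(Xtrain, Ytrain, diagonal=TRUE,
           lambda.var=0, lambda.freqs=0)    # DDA without any shrinkage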

Details

In order to train the LDA or DDA classifier, three separate shrinkage estimators are employed: one for the class frequencies (freqs.shrink), one for the variances (var.shrink), and one for the correlation matrix (invcor.shrink).

Note that the three corresponding regularization parameters are obtained analytically, without resorting to computer-intensive resampling.
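
As a brief sketch (Xtrain and Ytrain as in the Examples section below), the analytically estimated intensities can be read off the regularization component of the fitted object:

fit = sda(Xtrain, Ytrain, verbose=FALSE)
fit$regularization    # the three estimated shrinkage intensities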

Value

sda trains the classifier and returns an sda object with the following components needed for the subsequent prediction (a short sketch of inspecting them follows the list):

regularization

a vector containing the three estimated shrinkage intensities,

freqs

the estimated class frequencies,

alpha

a vector containing the intercepts used for prediction,

beta

a matrix containing the coefficients used for prediction.
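
A short sketch of inspecting these components (Xtrain, Ytrain and Xtest as in the Examples section below; prediction itself is best left to predict.sda):

fit = sda(Xtrain, Ytrain)
fit$freqs                   # shrunken class frequencies, one per class
length(fit$alpha)           # one intercept per class
dim(fit$beta)               # coefficient matrix relating the features to the classes
predict(fit, Xtest)$class   # predict.sda combines alpha and beta into class predictions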

Author(s)

Miika Ahdesmäki and Korbinian Strimmer (https://strimmerlab.github.io).

References

Ahdesmäki, M., and K. Strimmer. 2010. Feature selection in omics prediction problems using cat scores and false non-discovery rate control. Ann. Appl. Stat. 4: 503-519. <DOI:10.1214/09-AOAS277>

See Also

predict.sda, sda.ranking, freqs.shrink, var.shrink, invcor.shrink.

Examples

# load sda library
library("sda")

########################## 
# training and test data #
##########################

# data set containing the SRBCT samples
get.srbct = function()
{
  data(khan2001)
  idx = which( khan2001$y == "non-SRBCT" )
  x = khan2001$x[-idx,]
  y = factor(khan2001$y[-idx])
  descr = khan2001$descr[-idx]

  list(x=x, y=y, descr=descr)
}
srbct = get.srbct()

# training data
Xtrain = srbct$x[1:63,]
Ytrain = srbct$y[1:63]
Xtest = srbct$x[64:83,]
Ytest = srbct$y[64:83]


###################################################
# classification with correlation (shrinkage LDA) #
###################################################

sda.fit = sda(Xtrain, Ytrain)
ynew = predict(sda.fit, Xtest)$class # using all 2308 features
sum(ynew != Ytest) 

###########################################################
# classification with diagonal covariance (shrinkage DDA) #
###########################################################

sda.fit = sda(Xtrain, Ytrain, diagonal=TRUE)
ynew = predict(sda.fit, Xtest)$class # using all 2308 features
sum(ynew != Ytest) 
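
##################################################
# sketch: classification after feature selection #
##################################################

# This block is a sketch and not part of the original example: it assumes that
# the ranking matrix returned by sda.ranking() contains an "idx" column giving
# the feature ordering (see ?sda.ranking).
ra = sda.ranking(Xtrain, Ytrain)
top = ra[1:100, "idx"]                       # keep the 100 top-ranked features
sda.fit = sda(Xtrain[, top], Ytrain)
ynew = predict(sda.fit, Xtest[, top])$class  # using only the selected features
sum(ynew != Ytest)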

#################################################################
# for complete example scripts illustrating classification with #
# feature selection visit https://strimmerlab.github.io/software/sda/  #
#################################################################
