SPCALDA: A New Reduced-Rank Linear Discriminant Analysis Method


View source: R/SPCALDA.R

Description

A new reduced-rank LDA method which works for high dimensional multi-class data.
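The reduced-rank idea itself is simple to sketch: rotate the data onto a small number of principal components, then run ordinary LDA in that reduced space. The sketch below illustrates only this generic recipe, not the SPCALDA algorithm itself; the function name reducedRankLDA and its arguments are hypothetical.

library(MASS)

# Illustration only: project onto the top K principal directions,
# then fit ordinary LDA in the K-dimensional space.
reducedRankLDA = function(X, Y, K = 5) {
        rotation = prcomp(X)$rotation[, 1:K, drop = FALSE]
        fit = lda(X %*% rotation, grouping = Y)
        list(ob = fit, rotation = rotation)
}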

Usage

SPCALDA(X,Y,rho=exp(c((-2):6)),K=min(20,min(dim(X))), folds = NULL)

Arguments

X

Input matrix, of dimension nobs x nvars; each row is an observation vector.

Y

Response variable for class label, of dimension nobs x 1.

rho

Candidate values of the tuning parameter rho; the value used is selected by cross-validation.

K

The total number of principal components considered.

folds

Folds used in cross-validation to select the tuning parameter; see the sketch below for one way to construct them.
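The folds argument is a list of index vectors that partition the training rows, as in the Examples below. A minimal sketch of building five random folds (variable names are illustrative):

n = 100                                              # number of training observations
folds = split(sample(n), rep(1:5, length.out = n))   # five random folds of 20 rows each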

Value

ob

LDA rule fitted on the top principal components

tuneRotation

Tuned rotation matrix

minerror

Minimal training error

rho

Tuned value of the parameter rho

K

Tuned dimension, i.e., the number of principal components used

Author(s)

Yue S. Niu, Ning Hao and Bin Dong

Examples

library(SPCALDA)

set.seed(2015)
n = 200; p = 500

X = matrix(rnorm(n*p), n, p)
mu = matrix(0, 4, p)
mu[1,1:125] = 0.4; mu[2,126:250] = 0.4; mu[3,251:375] = 0.4; mu[4,376:500] = 0.4
Y = rep(1:4, 50)

# Shift the mean of each class to create four separated groups
for (g in 1:4) {
        index = which(Y == g)
        n_g = length(index)
        X[index,] = X[index,] + matrix(mu[g,], n_g, p, byrow = TRUE)
}

xtr = X[1:100,]; ytr = Y[1:100]     # training set
xte = X[101:200,]; yte = Y[101:200] # test set
folds = list(1:20, 21:40, 41:60, 61:80, 81:100)

spcaldaResult = SPCALDA(X = xtr, Y = ytr, rho = exp(c((-2):6)), K = 20, folds = folds)
yhat = predict(spcaldaResult$ob, xte %*% spcaldaResult$tuneRotation)$class
error = sum(yhat != yte)            # number of misclassified test observations
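The misclassification count can be turned into a test error rate by dividing by the number of test observations:

errorRate = error / length(yte)     # fraction of test cases misclassified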
