glpls1a.cv.error: Leave-one-out cross-validation error using IRWPLS and IRWPLSF model fits



Description

Computes the leave-one-out cross-validation (LOOCV) training-set classification error when fitting an IRWPLS or IRWPLSF model for two-group classification.

Usage

glpls1a.cv.error(train.X, train.y, K.prov = NULL, eps = 1e-3, lmax = 100,
                 family = "binomial", link = "logit", br = TRUE)

Arguments

train.X

n by p design matrix (with no intercept term) for the training set

train.y

response vector (0 or 1) for the training set

K.prov

number of PLS components; the default is the rank of train.X

eps

tolerance for convergence

lmax

maximum number of iterations allowed

family

GLM family; "binomial" is the only relevant choice here

link

link function; "logit" is the only one currently implemented

br

logical; if TRUE, Firth's bias-reduction procedure is used

Value

error

LOOCV training error

error.obs

indices of the misclassified observations

Author(s)

Beiying Ding, Robert Gentleman

See Also

glpls1a.train.test.error, glpls1a.mlogit.cv.error, glpls1a, glpls1a.mlogit, glpls1a.logit.all

Examples

 library(gpls)

 x <- matrix(rnorm(20), ncol = 2)
 y <- sample(0:1, 10, replace = TRUE)

 ## no bias reduction
 glpls1a.cv.error(x, y, br = FALSE)
 ## bias reduction and 1 PLS component
 glpls1a.cv.error(x, y, K.prov = 1, br = TRUE)
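
 ## A minimal sketch of inspecting the returned components; the Value
 ## section above documents 'error' and 'error.obs', and this sketch
 ## assumes they are returned together in a named list.
 res <- glpls1a.cv.error(x, y, br = FALSE)
 res$error      # LOOCV training error
 res$error.obs  # indices of the misclassified observations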
