cv.lrome: Cross-validation for lrome


View source: R/cv.lrome.R

Description

Performs k-fold cross-validation for lrome, produces a plot, and returns the optimal values of lambda. This function is adapted from the cv.glmnet function in the glmnet package.

Usage

cv.lrome(x, y, lambda = NULL, nfolds = 5, foldid, delta=2,...)

Arguments

x

x matrix as in lrome.

y

response variable or class label y as in lrome.

lambda

optional user-supplied lambda sequence; default is NULL, and lrome chooses its own sequence.

nfolds

number of folds - default is 5. Although nfolds can be as large as the sample size (leave-one-out CV), it is not recommended for large datasets. Smallest value allowable is nfolds=3.

foldid

an optional vector of values between 1 and nfolds identifying which fold each observation is in. If supplied, nfolds can be missing.

delta

the parameter delta of the Huber loss function, used in lasso Huber regression.

...

other arguments that can be passed to lrome.
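As a concrete illustration of the foldid argument, a balanced fold assignment can be built with base R alone. The sketch below is illustrative (the values of n and nfolds are assumptions, not package defaults):

```r
# A minimal sketch of constructing a foldid vector by hand,
# assuming n = 20 observations split into nfolds = 5 folds.
set.seed(1)
n <- 20
nfolds <- 5
# Repeat the fold labels 1..nfolds up to length n, then shuffle so that
# fold membership is random while fold sizes stay balanced.
foldid <- sample(rep(seq_len(nfolds), length.out = n))
table(foldid)  # each fold contains n / nfolds = 4 observations
```

A vector built this way can then be passed as the foldid argument, in which case nfolds may be omitted.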

Details

The function runs lrome nfolds+1 times; the first to get the lambda sequence, and then the remainder to compute the fit with each of the folds omitted. The average error and standard deviation over the folds are computed.
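The fold-wise aggregation described above can be sketched in base R. The matrix cv.err below is a hypothetical stand-in for the per-fold prediction errors (one row per fold, one column per lambda), and the use of the standard error sd/sqrt(nfolds) for cvsd follows the convention of cv.glmnet, on which this function is based:

```r
# Toy matrix of prediction errors: 5 folds (rows) x 3 lambda values (columns).
# These numbers are illustrative, not output of cv.lrome.
cv.err <- matrix(c(1.0, 1.2, 0.8, 1.1, 0.9,
                   0.7, 0.8, 0.6, 0.9, 0.5,
                   0.9, 1.0, 0.8, 1.1, 0.7), nrow = 5)
cvm  <- colMeans(cv.err)                           # mean CV error per lambda
cvsd <- apply(cv.err, 2, sd) / sqrt(nrow(cv.err))  # standard error of cvm
cvupper <- cvm + cvsd                              # upper curve
cvlower <- cvm - cvsd                              # lower curve
```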

Value

an object of class cv.lrome is returned, which is a list with the ingredients of the cross-validation fit.

lambda

the values of lambda used in the fits.

cvm

the mean cross-validated error - a vector of length length(lambda).

cvsd

estimate of standard error of cvm.

cvupper

upper curve = cvm+cvsd.

cvlower

lower curve = cvm-cvsd.

nzero

number of non-zero coefficients at each lambda.

name

a text string indicating type of measure (for plotting purposes).

lrome.fit

a fitted lrome object for the full data.

lambda.min

the value of lambda that gives the minimum cross-validation error cvm.

lambda.1se

the largest value of lambda such that the error is within 1 standard error of the minimum.
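The two selection rules can be sketched from the returned components. The base R sketch below uses illustrative values (not package output) and assumes lambda is sorted in decreasing order, as in glmnet:

```r
# A minimal sketch of the lambda.min and lambda.1se rules, using
# made-up values for lambda, cvm and cvsd.
lambda <- c(1.0, 0.5, 0.1, 0.05)    # decreasing, as in glmnet
cvm    <- c(0.9, 0.45, 0.4, 0.5)    # mean CV error at each lambda
cvsd   <- c(0.1, 0.1, 0.1, 0.1)     # standard error of cvm

# lambda.min: the lambda with the smallest mean CV error.
idmin      <- which.min(cvm)
lambda.min <- lambda[idmin]

# lambda.1se: the largest lambda whose error is within one standard
# error of the minimum (favors a sparser model).
lambda.1se <- max(lambda[cvm <= cvm[idmin] + cvsd[idmin]])
```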

References

Yang, Y. and Zou, H. (2012), "An Efficient Algorithm for Computing The HHSVM and Its Generalizations," Journal of Computational and Graphical Statistics, 22, 396-415.
BugReport: https://github.com/emeryyi/fastcox.git

Friedman, J., Hastie, T., and Tibshirani, R. (2010), "Regularization paths for generalized linear models via coordinate descent," Journal of Statistical Software, 33, 1.
http://www.jstatsoft.org/v33/i01/

See Also

lrome, plot.cv.lrome, predict.cv.lrome, and coef.cv.lrome methods.

Examples

# fit an elastic net penalized HHSVM 
# with lambda2 = 0.1 for the L2 penalty. Use the 
# misclassification rate as the cross validation 
# prediction loss. Use five-fold CV to choose 
# the optimal lambda for the L1 penalty.

data(FHT)
set.seed(2011)
cv=cv.lrome(FHT$x, FHT$y, lambda2=0.1, 
	nfolds=5, delta=1.5)
plot(cv)

# fit an elastic net penalized least squares 
# with lambda2 = 0.1 for the L2 penalty. Use the 
# least square loss as the cross validation 
# prediction loss. Use five-fold CV to choose 
# the optimal lambda for the L1 penalty.

set.seed(2011)
cv1=cv.lrome(FHT$x, FHT$y_reg, lambda2=0.1, 
	nfolds=5)
plot(cv1)

# fit a lasso penalized logistic regression 
# with lambda2 = 0 to disable the L2 penalty. Use the 
# logistic loss as the cross validation 
# prediction loss. Use five-fold CV to choose 
# the optimal lambda for the L1 penalty.

set.seed(2011)
cv2=cv.lrome(FHT$x, FHT$y, lambda2 = 0, nfolds=5)
plot(cv2)

emeryyi/rome documentation built on May 6, 2019, 9:53 a.m.