Description:

Performs k-fold cross-validation for lrome, produces a plot, and returns values for lambda. This function is adapted from the cv function in the glmnet package.
Arguments:
  x        matrix of predictors; each row is an observation.
  y        response variable or class label.
  lambda   optional user-supplied lambda sequence; default is NULL, in
           which case lrome computes its own sequence.
  nfolds   number of folds; default is 5. Although nfolds can be as
           large as the sample size (leave-one-out CV), this is not
           recommended for large datasets.
  foldid   an optional vector of values between 1 and nfolds
           identifying which fold each observation is in.
  delta    the parameter delta of the Huber loss function, used by
           lasso huberized regression to compute the cross-validation
           loss.
  ...      other arguments that can be passed to lrome.
Details:

The function runs lrome nfolds+1 times: the first run obtains the lambda sequence, and the remaining runs compute the fit with each of the folds omitted in turn. The average error and its standard deviation over the folds are then computed.
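The fold-wise aggregation described above can be sketched in plain R. This is an illustrative toy computation, not code from the package; `cv_err` and `foldid` are made-up stand-ins for the per-observation prediction errors and fold assignments:

```r
set.seed(1)
# toy per-observation prediction errors: 100 observations, 10 lambda values
cv_err <- matrix(runif(100 * 10), nrow = 100, ncol = 10)
# assign each observation to one of 5 folds
foldid <- sample(rep(1:5, length.out = 100))
# per-fold mean error for each lambda (5 x 10 matrix)
fold_means <- apply(cv_err, 2, function(e) tapply(e, foldid, mean))
cvm  <- colMeans(fold_means)                # mean CV error curve
cvsd <- apply(fold_means, 2, sd) / sqrt(5)  # standard error over the folds
cvupper <- cvm + cvsd                       # upper curve
cvlower <- cvm - cvsd                       # lower curve
```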
Value:

An object of class cv.lrome is returned, which is a list with the ingredients of the cross-validation fit:
  lambda      the values of lambda used in the fits.
  cvm         the mean cross-validated error; a vector of length
              length(lambda).
  cvsd        estimate of the standard error of cvm.
  cvupper     upper curve = cvm + cvsd.
  cvlower     lower curve = cvm - cvsd.
  nzero       number of non-zero coefficients at each lambda.
  name        a text string indicating the type of measure (for
              plotting purposes).
  lrome.fit   a fitted lrome object for the full data.
  lambda.min  the value of lambda that gives the minimum mean
              cross-validated error.
  lambda.1se  the largest value of lambda such that the error is within
              one standard error of the minimum.
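The selection of lambda.min and lambda.1se can be illustrated with a small hand-made error curve (toy values, not package code):

```r
# toy decreasing lambda sequence and a toy CV error curve whose minimum
# falls at the 7th lambda
lambda <- 10^seq(0, -2, length.out = 10)
cvm  <- (seq_along(lambda) - 7)^2 + 5
cvsd <- rep(2, length(lambda))
idmin <- which.min(cvm)
lambda.min <- lambda[idmin]
# 1-SE rule: the largest lambda whose error is within one standard
# error of the minimum
lambda.1se <- max(lambda[cvm <= cvm[idmin] + cvsd[idmin]])
```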
References:

Yang, Y. and Zou, H. (2012), "An Efficient Algorithm for Computing The HHSVM and Its Generalizations," Journal of Computational and Graphical Statistics, 22, 396-415.

Friedman, J., Hastie, T., and Tibshirani, R. (2010), "Regularization Paths for Generalized Linear Models via Coordinate Descent," Journal of Statistical Software, 33(1). http://www.jstatsoft.org/v33/i01/

BugReport: https://github.com/emeryyi/fastcox.git
See Also:

lrome, and the plot.cv.lrome, predict.cv.lrome, and coef.cv.lrome methods.
Examples:

# fit an elastic net penalized HHSVM
# with lambda2 = 0.1 for the L2 penalty. Use the
# misclassification rate as the cross validation
# prediction loss. Use five-fold CV to choose
# the optimal lambda for the L1 penalty.
data(FHT)
set.seed(2011)
cv <- cv.lrome(FHT$x, FHT$y, lambda2 = 0.1,
               nfolds = 5, delta = 1.5)
plot(cv)
# fit an elastic net penalized least squares
# with lambda2 = 0.1 for the L2 penalty. Use the
# least square loss as the cross validation
# prediction loss. Use five-fold CV to choose
# the optimal lambda for the L1 penalty.
set.seed(2011)
cv1 <- cv.lrome(FHT$x, FHT$y_reg,
                lambda2 = 0.1, nfolds = 5)
plot(cv1)
# To fit a LASSO penalized logistic regression
# we set lambda2 = 0 to disable the L2 penalty. Use the
# logistic loss as the cross validation
# prediction loss. Use five-fold CV to choose
# the optimal lambda for the L1 penalty.
set.seed(2011)
cv2 <- cv.lrome(FHT$x, FHT$y, lambda2 = 0, nfolds = 5)
plot(cv2)