Perform k-fold cross-validation for elastic-net penalized Huber loss regression and quantile regression over a sequence of lambda values, and find an optimal lambda.
X 
The input matrix. 
y 
The response vector. 
... 
Additional arguments passed to the model fitting function specified by FUN. 
FUN 
Model fitting function. The default is "hqreg" which preprocesses the data internally. The other option is "hqreg_raw" which uses the raw data as is. 
ncores 
The number of processing cores to use for parallel cross-validation. Default is 1. 
nfolds 
The number of cross-validation folds. Default is 10. 
fold.id 
(Optional) a vector of values between 1 and nfolds indicating which fold each observation belongs to. If supplied, nfolds can be missing. By default the observations are randomly assigned by cv.hqreg. 
type.measure 
The default is "deviance", which uses the chosen loss function of the model. Other options include "mse" for mean squared error and "mae" for mean absolute error. 
seed 
(Optional) Seed for the random number generator in order to obtain reproducible results. 
The function randomly partitions the data into nfolds folds. It calls hqreg nfolds+1 times: the first call obtains the lambda sequence, and the remaining calls fit the model with each of the folds left out once for validation. The cross-validation error is the average of the validation errors over the nfolds fits.
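For example, a fixed partition can be supplied through fold.id instead of relying on the random assignment (a minimal sketch; the data here are illustrative):

```r
# Sketch: supply an explicit fold assignment instead of the default
# random partition. Each observation gets a fold label in 1..nfolds.
library(hqreg)
X <- matrix(rnorm(200 * 20), 200, 20)
y <- drop(X[, 1:5] %*% rnorm(5) + rnorm(200))
fid <- sample(rep(1:5, length.out = nrow(X)))  # 5 folds
cv_fixed <- cv.hqreg(X, y, nfolds = 5, fold.id = fid)
```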
Note that cv.hqreg does not search for values of alpha, gamma or tau. Specific values should be supplied; otherwise the defaults for hqreg are used. Users who would like to cross-validate alpha, gamma or tau as well should call cv.hqreg for each combination of these parameters, using the same "seed" in each call so that the partitioning remains the same.
The function returns an object of S3 class "cv.hqreg"
, which is a list containing:
cve 
The cross-validation error for each value of lambda, averaged across the folds. 
cvse 
The estimated standard error associated with each value of cve. 
type.measure 
Same as above. 
lambda 
The values of lambda used in the cross-validation fits. 
fit 
The fitted hqreg object for the whole data. 
lambda.1se 
The largest value of lambda for which the cross-validation error is within one standard error of its minimum. 
lambda.min 
The value of lambda with the minimum cross-validation error. 
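As a sketch of how these components might be used together (assuming coef accepts a lambda argument for "hqreg" fits, as in the package's coef method):

```r
# Sketch: extract the selected penalty values from a fitted cv.hqreg object.
library(hqreg)
X <- matrix(rnorm(200 * 20), 200, 20)
y <- drop(X[, 1:5] %*% rnorm(5) + rnorm(200))
cv <- cv.hqreg(X, y, seed = 123)
cv$lambda.min   # lambda minimizing the cross-validation error
cv$lambda.1se   # largest lambda within 1 SE of the minimum
# Coefficients from the full-data fit (cv$fit) at the chosen lambda:
beta_min <- coef(cv$fit, lambda = cv$lambda.min)
```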
Congrui Yi <congruiyi@uiowa.edu>
Yi, C. and Huang, J. (2016) Semismooth Newton Coordinate Descent Algorithm for Elastic-Net Penalized Huber Loss Regression and Quantile Regression. Journal of Computational and Graphical Statistics, accepted in Nov 2016.
https://arxiv.org/abs/1509.02957
http://www.tandfonline.com/doi/full/10.1080/10618600.2016.1256816
X = matrix(rnorm(1000*100), 1000, 100)
beta = rnorm(10)
eps = 4*rnorm(1000)
y = drop(X[,1:10] %*% beta + eps)
cv = cv.hqreg(X, y, seed = 123)
plot(cv)
cv_raw = cv.hqreg(X, y, FUN = "hqreg_raw", seed = 321)
predict(cv_raw, X[1:5,])
# parallel cross validation
## Not run:
cv_parallel = cv.hqreg(X, y, ncores = 5)
plot(cv_parallel)
## End(Not run)
