tuning_loocv: Calculating the Tuning Parameter Using LOOCV


View source: R/tuning.R

Description

Calculate the tuning parameter using leave-one-out cross-validation (LOOCV).

Usage

tuning_loocv(Y, X, K_mat, lambda)

Arguments

Y

(matrix, n*1) The vector of the response variable.

X

(matrix, n*d_fix) The fixed effect matrix.

K_mat

(list of matrices) A nested list of kernel term matrices, corresponding to each kernel term specified in the formula, for each base kernel function in kern_func_list.

lambda

(numeric) A numeric vector of candidate values from which the tuning parameter is chosen. The lower limit of lambda must be above 0.
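
A minimal usage sketch, assuming simulated data and a single Gaussian (RBF) kernel matrix built by hand; the shape of K_mat and the kernel construction are illustrative only, and tuning_loocv may be internal to the package, in which case it can be reached via CVEK:::tuning_loocv:

library(CVEK)

n <- 50
X <- cbind(1, rnorm(n))                        # fixed-effect design (n x d_fix)
Z <- matrix(rnorm(n * 2), n, 2)                # covariates entering the kernel term
Y <- matrix(X %*% c(1, 0.5) + sin(Z[, 1]) + rnorm(n, sd = 0.3), n, 1)

# one RBF kernel matrix for the single kernel term (assumed layout of K_mat)
rbf <- function(Z, sigma = 1) exp(-as.matrix(dist(Z))^2 / (2 * sigma^2))
K_mat <- list(rbf(Z))

lambda <- exp(seq(-5, 5, length.out = 20))     # candidate tuning values, all above 0
lambda0 <- tuning_loocv(Y, X, K_mat, lambda)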

Details

Leave-one-out Cross Validation (LOOCV):

\hat{\lambda}_{n\text{-}\mathrm{CV}} = \underset{\lambda \in \Lambda}{\arg\min}\; \Big\{ \log\, y^{\star T} \big[I - \mathrm{diag}(A_\lambda) - \tfrac{1}{n} I\big]^{-1} (I - A_\lambda)^{2} \big[I - \mathrm{diag}(A_\lambda) - \tfrac{1}{n} I\big]^{-1} y^{\star} \Big\}
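
As a concrete illustration of this criterion (a sketch, not the package's internal code), the objective can be evaluated over the candidate grid, continuing the simulated objects from the usage sketch above. It assumes A_lambda = K (K + lambda I)^{-1}, the hat matrix of a plain kernel ridge fit, and treats y^star simply as the observed response; the package may define both quantities differently (for example after adjusting for the fixed effects in X).

loocv_criterion <- function(y_star, K, lambda) {
  n <- length(y_star)
  A <- K %*% solve(K + lambda * diag(n))              # hat matrix A_lambda (assumed form)
  B <- solve(diag(n) - diag(diag(A)) - diag(n) / n)   # [I - diag(A_lambda) - (1/n) I]^{-1}
  drop(log(t(y_star) %*% B %*% (diag(n) - A) %*% (diag(n) - A) %*% B %*% y_star))
}

crit <- sapply(lambda, function(l) loocv_criterion(drop(Y), K_mat[[1]], l))
lambda_hat <- lambda[which.min(crit)]                 # candidate minimizing the criterion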

Value

lambda0

(numeric) The estimated tuning parameter.

Author(s)

Wenying Deng

References

Philip S. Boonstra, Bhramar Mukherjee, and Jeremy M. G. Taylor. A Small-Sample Choice of the Tuning Parameter in Ridge Regression. July 2015.

Trevor Hastie, Robert Tibshirani, and Jerome Friedman. The Elements of Statistical Learning: Data Mining, Inference, and Prediction, Second Edition. Springer Series in Statistics. Springer-Verlag, New York, 2009.

Hirotugu Akaike. Information Theory and an Extension of the Maximum Likelihood Principle. In Selected Papers of Hirotugu Akaike, Springer Series in Statistics, pages 199–213. Springer, New York, NY, 1998.

Clifford M. Hurvich and Chih-Ling Tsai. Regression and time series model selection in small samples. June 1989.

Clifford M. Hurvich, Jeffrey S. Simonoff, and Chih-Ling Tsai. Smoothing parameter selection in nonparametric regression using an improved Akaike information criterion. January 2002.
