This function implements sparse frequentist generalized additive models (GAMs) with the group LASSO, group SCAD, and group MCP penalties. Let y_i denote the ith response and x_i denote a p-dimensional vector of covariates. GAMs are of the form
g(E(y_i)) = β_0 + ∑_{j=1}^{p} f_j(x_{ij}), i = 1, ..., n,
where g is a monotone increasing link function. The identity link is used for Gaussian regression, the logit link for binomial regression, and the log link for Poisson, negative binomial, and gamma regression. The univariate functions f_j are estimated using linear combinations of B-spline basis functions. Under group regularization of the basis coefficients, some of the univariate functions will be estimated as \hat{f}_j(x_j) = 0, depending on the size of the regularization parameter λ.
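As an illustration of the basis expansion described above, the sketch below expands a single covariate in a B-spline basis using base R's splines::bs. This is only a conceptual sketch: the package's internal basis construction may differ, and the coefficient vector here is hypothetical.

```r
## Sketch: expanding one covariate x_j in df B-spline basis functions.
## Uses splines::bs from base R; sparseGAM's internal construction may differ.
library(splines)

set.seed(1)
x = runif(100)        # one covariate
df = 6                # number of basis functions per covariate (illustrative)
B = bs(x, df = df)    # 100 x df matrix of B-spline basis evaluations

## Under group regularization, the df coefficients for covariate j are
## penalized as a single group; if the whole group is shrunk to zero,
## the fitted univariate function f_j is identically zero.
theta = rnorm(df)     # hypothetical basis coefficients for group j
f.hat = B %*% theta   # fitted values of f_j(x) at the training points
```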
For implementation of sparse Bayesian GAMs with the SSGL penalty, use the SBGAM function.
y: n \times 1 vector of responses for training data.

X: n \times p design matrix for training data, where the jth column of X corresponds to the jth covariate.

X.test: n_{test} \times p design matrix for test data on which to calculate predictions.

df: number of B-spline basis functions to use in each basis expansion. Default is …

family: exponential dispersion family. Allows for …

nb.size: known size parameter α in the NB(α, μ_i) distribution for negative binomial responses. Default is …

gamma.shape: known shape parameter ν in the Gamma(μ_i, ν) distribution for gamma responses. Default is …

penalty: group regularization method to use on the groups of basis coefficients. The options are …

taper: tapering term γ in group SCAD and group MCP that controls how rapidly the penalty tapers off. Default is …

nlambda: number of regularization parameters L. Default is …

lambda: grid of L regularization parameters. The user may specify either a scalar or a vector. If not provided, the program chooses the grid automatically.

max.iter: maximum number of iterations in the algorithm. Default is …

tol: convergence threshold for the algorithm. Default is …
The function returns a list containing the following components:

lambda: L \times 1 vector of regularization parameters.

f.pred: list of L n_{test} \times p matrices of predicted function evaluations, where the kth matrix in the list corresponds to the kth regularization parameter in lambda.

mu.pred: n_{test} \times L matrix of predicted mean response values μ_{test} = E(Y_{test}) based on the test data in X.test.

classifications: p \times L matrix of classifications. An entry of "1" indicates that the corresponding function was classified as nonzero, and an entry of "0" indicates that it was classified as zero. The kth column of classifications corresponds to the kth regularization parameter in lambda.

beta0: L \times 1 vector of estimated intercepts. The kth entry in beta0 corresponds to the kth regularization parameter in lambda.

beta: dp \times L matrix of estimated basis coefficients. The kth column in beta corresponds to the kth regularization parameter in lambda.

loss: vector of either the residual sum of squares (…
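Since beta stacks the df basis coefficients of each covariate into a dp \times 1 column per regularization parameter, the coefficients of an individual function can be recovered by row indexing. The helper below is a hypothetical sketch, not part of the package API; it assumes the groups are stacked in covariate order with df rows per covariate.

```r
## Hypothetical sketch: extract the basis coefficients of covariate j at the
## kth regularization parameter from the dp x L matrix `beta`.
## Assumes rows (j-1)*df + 1 through j*df hold the jth group.
extract_group = function(beta, j, k, df) {
  rows = ((j - 1) * df + 1):(j * df)
  beta[rows, k]
}

## e.g. extract_group(SFGAM.mod$beta, j = 1, k = 1, df = 6) would return the
## six basis coefficients of f_1 at the first regularization parameter.
```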
Breheny, P. and Huang, J. (2015). "Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors." Statistics and Computing, 25:173-187.
Wang, H. and Leng, C. (2007). "Unified LASSO estimation by least squares approximation." Journal of the American Statistical Association, 102:1039-1048.
Yuan, M. and Lin, Y. (2006). "Model selection and estimation in regression with grouped variables." Journal of the Royal Statistical Society: Series B (Statistical Methodology), 68:49-67.
## Generate data
set.seed(12345)
X = matrix(runif(100*20), nrow=100)
n = dim(X)[1]
y = 5*sin(2*pi*X[,1]) - 5*cos(2*pi*X[,2]) + rnorm(n)
## Test data with 50 observations
X.test = matrix(runif(50*20), nrow=50)
## K-fold cross-validation with group MCP penalty
cv.mod = cv.SFGAM(y, X, family="gaussian", penalty="gMCP")
## Plot CVE curve
plot(cv.mod$lambda, cv.mod$cve, type="l", xlab="lambda", ylab="CVE")
## lambda which minimizes cross-validation error
lambda.opt = cv.mod$lambda.min
## Fit a single model with lambda.opt
SFGAM.mod = SFGAM(y, X, X.test, penalty="gMCP", lambda=lambda.opt)
## Classifications
SFGAM.mod$classifications
## Predicted function evaluations on test data
f.pred = SFGAM.mod$f.pred
## Plot estimated first function
x1 = X.test[,1]
f1.hat = f.pred[[1]][,1]  # first (only) matrix in the list, first function
## Plot x_1 against f_1(x_1)
plot(x1[order(x1)], f1.hat[order(x1)], xlab=expression(x[1]),
ylab=expression(f[1](x[1])))
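Because the simulation above generated y with f_1(x_1) = 5 sin(2π x_1), the true curve can be overlaid on the scatterplot as a quick visual check. This is not part of the original example, and note that estimated functions in GAMs are typically centered, so the estimate may differ from the raw truth by a vertical shift.

```r
## Overlay the true f_1 from the data-generating model for comparison.
## Fitted f_j's are typically centered to mean zero, so the estimate
## may be shifted vertically relative to this curve.
grid = seq(0, 1, length.out = 200)
lines(grid, 5*sin(2*pi*grid), lty = 2)
legend("topleft", legend = c("estimated", "true"),
       pch = c(1, NA), lty = c(NA, 2))
```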
