hybrid.logistic: A Lasso-concave hybrid penalty for logistic regression

View source: R/cvplogistic.R

Description

Compute solution surface for a high-dimensional logistic regression model with Lasso-concave hybrid penalty for fast variable selection

Usage

hybrid.logistic(y, x, penalty = "mcp", kappa = 1/2.7,
  nlambda = 100, lambda.min = 0.01, epsilon = 1e-3, maxit = 1e+3)

Arguments

y

response vector with elements 0 or 1.

x

the design matrix of penalized variables. By default, an intercept vector will be added when fitting the model.

penalty

a character specifying the penalty. One of "mcp" or "scad" should be specified, with "mcp" being the default.

kappa

a value specifying the regularization parameter kappa. The proper range for kappa is [0, 1).

nlambda

an integer value specifying the number of grid points along the penalty parameter lambda.

lambda.min

a value specifying how the minimal value of the penalty parameter lambda is determined: lambda_min = lambda_max*lambda.min. We suggest lambda.min = 0.0001 if n > p, and 0.01 otherwise. A sketch of a typical lambda grid follows the argument list.

epsilon

a value specifying the convergence criterion of the algorithm.

maxit

an integer value specifying the maximum number of iterations for each coordinate.
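
For illustration only, here is a minimal sketch of how a lambda grid can be built from nlambda and lambda.min using the relation lambda_min = lambda_max*lambda.min stated above. The log-spacing and the value of lambda.max are assumptions made for the sketch; the package determines lambda_max from the data.

nlambda    <- 100
lambda.min <- 0.01
lambda.max <- 0.25    ## illustrative value; in the package it is data-driven
lambdas <- exp(seq(log(lambda.max),
                   log(lambda.max * lambda.min),    ## lambda_min = lambda_max*lambda.min
                   length.out = nlambda))
range(lambdas)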

Details

A Lasso-concave hybrid penalty applies the SCAD or MCP penalty only to the variables selected by the Lasso. The idea is to use the Lasso as a screening tool to filter variables, then apply the SCAD or MCP penalty to the selected variables for further selection. Computation with the hybrid penalty is faster than with the standard concave penalty. The risk of using the hybrid penalty is that variables missed by the Lasso will also not be selected by the SCAD/MCP penalty.
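
The following is a minimal conceptual sketch of this screen-then-refit idea, not the package's internal code. It assumes the glmnet and ncvreg packages are available and uses them as stand-ins for the Lasso screen and the MCP refit; hybrid.logistic performs an analogous computation over its own lambda grid.

library(glmnet)
library(ncvreg)

set.seed(1)
n <- 100; p <- 10
x <- matrix(rnorm(n * p), n, p)
y <- rbinom(n, 1, 0.4)

## Stage 1: Lasso screen (cross-validated logistic Lasso)
cvfit <- cv.glmnet(x, y, family = "binomial")
b <- as.matrix(coef(cvfit, s = "lambda.min"))[-1, 1]   ## drop the intercept
keep <- which(b != 0)                                   ## variables passing the screen

## Stage 2: concave (MCP) penalty restricted to the screened variables
if (length(keep) > 0) {
  fit2 <- ncvreg(x[, keep, drop = FALSE], y, family = "binomial", penalty = "MCP")
}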

Value

A list of two elements is returned.

coef

A matrix of dimension (p+1) x nlambda, with p the number of variables (columns) in x. The first row is the intercept, which is added by default.

lambdas

A vector of length nlambda for the penalty parameter lambda, ranging from the largest to the smallest.
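
As an illustration of working with the returned object (using y and x as in the Examples below, and assuming the two list components are named coef and lambdas as documented; otherwise index them positionally as out[[1]] and out[[2]]):

out <- hybrid.logistic(y, x, "mcp")
dim(out$coef)              ## (p + 1) rows, nlambda columns; row 1 is the intercept
head(out$lambdas)          ## lambdas from the largest to the smallest
beta10 <- out$coef[, 10]   ## coefficient estimates at the 10th lambda value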

Author(s)

Dingfeng Jiang

References

Jiang, D., Huang, J. Majorization Minimization by Coordinate Descent for Concave Penalized Generalized Linear Models.

Zou, H., Li, R. (2008). One-step Sparse Estimates in Nonconcave Penalized Likelihood Models. Ann Stat, 36(4): 1509-1533.

Breheny, P., Huang, J. (2011). Coordinate Descent Algorithms for Nonconvex Penalized Regression, with Application to Biological Feature Selection. Ann Appl Stat, 5(1), 232-253.

Jiang, D., Huang, J., Zhang, Y. (2011). The Cross-validated AUC for MCP-Logistic Regression with High-dimensional Data. Stat Methods Med Res, online first, Nov 28, 2011.

See Also

cvplogistic, cv.cvplogistic, cv.hybrid, path.plot

Examples

set.seed(10000)
n=100
y=rbinom(n,1,0.4)
p=10
x=matrix(rnorm(n*p),n,p)

## Lasso-concave hybrid using MCP penalty
out=hybrid.logistic(y, x, "mcp")
## Lasso-concave hybrid using SCAD penalty
## out=hybrid.logistic(y, x, "scad")
