Fit a linear regression model via penalized maximum likelihood and cross-validation, then compute the difference statistic
W_j = |Z_j| - |\tilde{Z}_j|
where Z_j and \tilde{Z}_j are the coefficient estimates for the j-th variable and its knockoff, respectively. The value of the regularization parameter λ is selected by cross-validation and computed with glmnet.
MFKnockoffs.stat.lasso_coef_difference(X, X_k, y, cores = 2, ...)
X: original design matrix (size n-by-p)
X_k: knockoff matrix (size n-by-p)
y: response vector (length n); it should be numeric
cores: number of cores used to compute the knockoff statistics by running cv.glmnet. If not specified, the number of cores is set to approximately half of the number of cores detected by the parallel package (see the sketch after this list).
...: additional arguments specific to glmnet (see Details)
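As a rough illustration of the default for cores, here is a hypothetical reconstruction; the exact rule is an assumption, since the documentation only states "approximately half" of the detected cores:

# Hypothetical reconstruction of the default number of cores;
# the package only documents "approximately half of the detected cores".
default_cores = max(1, floor(parallel::detectCores() / 2))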
This function uses the glmnet package to fit the lasso path. It is a wrapper around the more general MFKnockoffs.stat.glmnet_coef_difference.
The knockoff statistics W_j are constructed by taking the difference between the absolute coefficient estimate of the j-th variable and that of its knockoff. By default, the value of the regularization parameter is chosen by 10-fold cross-validation.
The optional nlambda parameter can be used to control the granularity of the grid of λ values; its default value is 100. Unless a lambda sequence is provided by the user, this function generates it on a log-linear scale before calling glmnet.
For a complete list of the available additional arguments, see cv.glmnet and glmnet.
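The computation described above can be sketched directly with cv.glmnet. The following is a minimal illustration under the stated defaults (gaussian family, 10-fold cross-validation, coefficients taken at the cross-validated lambda); it is not the package's exact implementation, which may differ in details such as internal column randomization, and the helper name lasso_coef_diff_sketch is hypothetical.

library(glmnet)

# Minimal sketch of the statistic: fit the lasso on the augmented design
# [X, X_k], select lambda by 10-fold cross-validation, and take the
# difference of absolute coefficient magnitudes.
lasso_coef_diff_sketch = function(X, X_k, y, nlambda = 100) {
  p = ncol(X)
  fit = cv.glmnet(cbind(X, X_k), y, family = "gaussian", nlambda = nlambda)
  # Coefficients at the cross-validated lambda (lambda.min here, an assumption),
  # dropping the intercept.
  Z = coef(fit, s = "lambda.min")[-1]
  # W_j = |Z_j| - |Z~_j|
  abs(Z[1:p]) - abs(Z[(p + 1):(2 * p)])
}

On data such as that in the Examples below, this sketch should produce statistics roughly comparable to those returned by MFKnockoffs.stat.lasso_coef_difference, up to randomness in the cross-validation folds.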
A vector of statistics W of length p.
Other statistics for knockoffs: MFKnockoffs.stat.forward_selection, MFKnockoffs.stat.glmnet_coef_difference, MFKnockoffs.stat.glmnet_lambda_difference, MFKnockoffs.stat.lasso_coef_difference_bin, MFKnockoffs.stat.lasso_lambda_difference_bin, MFKnockoffs.stat.lasso_lambda_difference, MFKnockoffs.stat.random_forest, MFKnockoffs.stat.sqrt_lasso, MFKnockoffs.stat.stability_selection
# Problem dimensions and sparsity level
p=100; n=200; k=15
# Gaussian design: mean vector, covariance matrix, and data
mu = rep(0,p); Sigma = diag(p)
X = matrix(rnorm(n*p),n)
# Sparse coefficient vector with k nonzero entries
nonzero = sample(p, k)
beta = 3.5 * (1:p %in% nonzero)
y = X %*% beta + rnorm(n)
# Knockoff generator passed to the filter
knockoffs = function(X) MFKnockoffs.create.gaussian(X, mu, Sigma)

# Basic usage with default arguments
result = MFKnockoffs.filter(X, y, knockoffs=knockoffs,
                            statistic=MFKnockoffs.stat.lasso_coef_difference)
print(result$selected)

# Advanced usage with custom arguments
foo = MFKnockoffs.stat.lasso_coef_difference
k_stat = function(X, X_k, y) foo(X, X_k, y, nlambda=200)
result = MFKnockoffs.filter(X, y, knockoffs=knockoffs, statistic=k_stat)
print(result$selected)
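The statistic can also be called directly on a pre-computed knockoff copy, outside of MFKnockoffs.filter. A brief sketch reusing the objects defined above:

# Direct use of the statistic with an explicit knockoff copy
X_k = MFKnockoffs.create.gaussian(X, mu, Sigma)
W = MFKnockoffs.stat.lasso_coef_difference(X, X_k, y, cores=2)
print(head(W))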