knitr::opts_chunk$set(collapse = TRUE, comment = "#")

susieR allows for a customized initialization. In this vignette we demonstrate how to use an L0Learn fit to initialize susieR.

library(susieR)
library(L0Learn)

Simulate data

First, we simulate data as in the minimal example: n = 1000 samples, p = 1000 variables, and four non-zero effects.

set.seed(1)
n = 1000
p = 1000
beta = rep(0,p)
beta[c(1,2,300,400)] = 1
X = matrix(rnorm(n*p),nrow=n,ncol=p)
y = X %*% beta + rnorm(n)
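
As a quick optional check on the simulated data (a small sketch, using only the objects created above), we can confirm the dimensions of X and the positions of the true non-zero effects:

dim(X)
which(beta != 0)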

Fit L0Learn

We start by fitting an L0-regularized model to the simulated data.

set.seed(1)
L0fit = L0Learn.cvfit(X, y, penalty = "L0")
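
To see how the cross-validation error varies along the regularization path, L0Learn provides a plot method for cross-validation fits. A minimal sketch, assuming the single gamma value (0) used by the pure L0 penalty:

plot(L0fit, gamma = 0)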

Let's choose the penalty strength parameter that minimizes the cross-validation error.

# Index of the lambda value that minimizes the cross-validation error
lambdaIndex = which.min(L0fit$cvMeans[[1]])
# Coefficients (intercept first) at the selected lambda
L0coef = as.numeric(coef(L0fit$fit, lambda = L0fit$fit$lambda[[1]][lambdaIndex]))
# Non-zero effect sizes and their variable indices, dropping the intercept
effect.beta = L0coef[which(L0coef != 0)][-1]
effect.index = (which(L0coef != 0) - 1)[-1]
length(effect.beta)
effect.beta[1:10]
effect.index[1:10]

The L0Learn model finds length(effect.beta) effects (see the output above), which will be used to initialize susie.
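
As an optional check (a small sketch, not required for the initialization), we can compare the selected indices against the true non-zero positions used in the simulation:

# True non-zero positions from the simulation above
true.index = c(1, 2, 300, 400)
# Which true effects were recovered, and which selections are extra
intersect(effect.index, true.index)
setdiff(effect.index, true.index)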

Build an initialization object

We create an initialization object from the L0Learn coefficients using the susie_init_coef function:

set.seed(1)
s.init = susie_init_coef(effect.index, effect.beta, p)
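
The returned object can be passed to susie through the s_init argument. To inspect its structure (a quick optional sketch):

str(s.init)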

Run susieR with initialization

We now run susie with this initialization, passing the s.init object created above through the s_init argument.

susieL0.fit = susie(X, y, s_init = s.init)
# Credible sets reported by susie
susieL0.fit$sets$cs
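
To visualize the result, the posterior inclusion probabilities from the fit can be plotted with susie_plot; a minimal sketch:

susie_plot(susieL0.fit, y = "PIP")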

References

  1. Hussein Hazimeh and Rahul Mazumder. (2018). Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms.

