This vignette demonstrates the performance of regularized regression methods on the toy example from Section 2, with a small modification. Here I use two well-known regularized regression methods: the lasso and the elastic net (EN).
```r
# install.packages(c("glmnet", "ggplot2"))
lapply(c("glmnet", "ggplot2"), require, character.only = TRUE)
```
## Generate data
```r
set.seed(2021)
n <- 300
p <- 100

# True coefficients: only variables 1 and 4 have nonzero effects
beta <- rep(0, p)
beta[c(1, 4)] <- 1

X <- matrix(rnorm(n * p), nrow = n, ncol = p)
# Make column 2 highly correlated with column 1, and column 3 with column 4
X[, 2] <- X[, 1] + rnorm(n, 0, 0.1)
X[, 3] <- X[, 4] + rnorm(n, 0, 0.1)

y <- X %*% beta + rnorm(n)

# Check the induced correlations
cor(X[, 1], X[, 2])
cor(X[, 3], X[, 4])
```
## Normalize
```r
X <- scale(X)
y <- scale(y)
```
## Lasso
```r
fit_lasso <- glmnet(X, y, family = "gaussian", alpha = 1, nlambda = 20)
plot(fit_lasso, xvar = "lambda", label = TRUE)
```
## EN
```r
fit_en <- glmnet(X, y, family = "gaussian", alpha = 0.5, nlambda = 20)
plot(fit_en, xvar = "lambda", label = TRUE)
```
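Beyond inspecting the solution paths, one way to compare how the two methods handle the correlated pairs (columns 1–2 and 3–4) is to pick a lambda by cross-validation and list the selected variables. The sketch below is not part of the original analysis; it assumes `cv.glmnet` with default 10-fold cross-validation and the `lambda.1se` rule, and the helper `nz` is a name introduced here for illustration:

```r
# Sketch: choose lambda by cross-validation, then compare selected variables
cv_lasso <- cv.glmnet(X, y, alpha = 1)
cv_en    <- cv.glmnet(X, y, alpha = 0.5)

# Hypothetical helper: names of the nonzero coefficients (intercept dropped)
nz <- function(cv_fit, s = "lambda.1se") {
  b <- as.matrix(coef(cv_fit, s = s))[-1, , drop = FALSE]
  rownames(b)[b != 0]
}

nz(cv_lasso)
nz(cv_en)
```

With highly correlated predictors, the lasso tends to keep one variable from each correlated pair, while the EN penalty (here `alpha = 0.5`) tends to keep both and share the coefficient between them; the exact selections depend on the cross-validation folds.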