lasso: Bayesian Lasso


Description

Inference for Bayesian lasso regression models by Gibbs sampling from the Bayesian posterior distribution.

Usage

lasso(X, y, T=1000, lambda2=1, beta = NULL, s2=var(y-mean(y)),
           rd=NULL, ab=NULL, icept=TRUE,
           normalize=TRUE, device=0, parameters=NULL)

Arguments

X

data.frame, matrix, or vector of inputs X

y

vector of output responses y of length equal to the leading dimension (rows) of X, i.e., length(y) == nrow(X)

T

total number of MCMC samples to be collected

beta

initial setting of the regression coefficients.

lambda2

square of the initial lasso penalty parameter.

s2

initial variance parameter.

rd

=c(r, delta), the alpha (shape) parameter and beta (rate) parameter to the gamma distribution prior G(r,delta) for the lambda2 parameter under the lasso model. A default of NULL generates appropriate non-informative values depending on the nature of the regression.

ab

=c(a, b), the alpha (shape) parameter and the beta (scale) parameter for the inverse-gamma distribution prior IG(a,b) for the variance parameter s2. A default of NULL generates appropriate non-informative values depending on the nature of the regression.

icept

if TRUE, an implicit intercept term is fit in the model, otherwise the intercept is zero; default is TRUE.

normalize

if TRUE, each variable is standardized to have unit L2-norm, otherwise it is left alone; default is TRUE.

device

the ID of the GPU device to use when no external pointer is provided to the function.

parameters

a 9-dimensional vector of parameters for tuning the GPU implementation.

Details

The Bayesian lasso model, hyperprior for the lasso parameter, and Gibbs sampling algorithm implemented by this function are identical to those described in detail in Park &amp; Casella (2008). The GPU implementation is derived from the CPU implementation blasso from package monomvn.
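For reference, the hierarchy targeted by the sampler (following Park &amp; Casella, 2008, writing sigma^2 for the s2 argument) can be sketched as:

```latex
% Bayesian lasso hierarchy; D_tau = diag(tau_1^2, ..., tau_p^2)
\begin{aligned}
y \mid \mu, X, \beta, \sigma^2 &\sim N_n(\mu \mathbf{1}_n + X\beta,\ \sigma^2 I_n),\\
\beta \mid \sigma^2, \tau_1^2, \ldots, \tau_p^2 &\sim N_p(0,\ \sigma^2 D_\tau),\\
\tau_j^2 &\overset{\text{iid}}{\sim} \mathrm{Exp}(\lambda^2/2), \qquad j = 1, \ldots, p,\\
\lambda^2 &\sim \mathrm{Gamma}(r, \delta), \qquad \sigma^2 \sim \mathrm{IG}(a, b).
\end{aligned}
```

Here rd = c(r, delta) and ab = c(a, b) supply the hyperparameters of the last line, and the tau2i component of the returned object holds samples of the inverses 1/tau_j^2.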

Value

lasso returns an object of class "lasso", which is a list containing a copy of all of the input arguments as well as of the components listed below.

mu

a vector of T samples of the (un-penalized) “intercept” parameter.

beta

a T*ncol(X) matrix of T samples from the (penalized) regression coefficients.

s2

a vector of T samples of the variance parameter.

lambda2

a vector of T samples of the penalty parameter.

tau2i

a T*ncol(X) matrix of T samples from the (latent) inverse diagonal of the prior covariance matrix for beta, obtained for Lasso regressions.
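A minimal sketch of how these components might be summarized, assuming `fit` is an object returned by lasso with the names listed above; the burn-in length of 200 is an arbitrary illustrative choice, not a package default:

```r
# Summarize MCMC output from a fitted "lasso" object `fit`
# (component names follow the Value section above).
summarize_lasso <- function(fit, burnin = 200) {
  keep <- (burnin + 1):nrow(fit$beta)          # discard warm-up draws
  list(
    beta_mean    = colMeans(fit$beta[keep, , drop = FALSE]),   # posterior means
    beta_ci      = apply(fit$beta[keep, , drop = FALSE], 2,
                         quantile, probs = c(0.025, 0.975)),   # 95% credible intervals
    s2_mean      = mean(fit$s2[keep]),
    lambda2_mean = mean(fit$lambda2[keep])
  )
}
```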

See Also

rpg, mlr

Examples

set.seed(0)
n_samples  <- 500
n_features <- 40
X <- matrix(rnorm(n_features * n_samples), nrow = n_samples)
y <- 2 * X[,1] - 3 * X[,2] + rnorm(n_samples) # only features 1 & 2 are relevant

X_train <- X[1:400,]
y_train <- y[1:400]
X_test  <- X[401:500,]
y_test  <- y[401:500]


# START ------------------------------------------------------------------------

# first, standardize the training data
X_train <- scale(X_train)

tmp00 <- bayesCL::lasso(X = X_train,
                        y = y_train,
                        T = 500,       # number of Gibbs sampling iterations
                        icept = TRUE,  # fit an implicit intercept term
                        device = 0)    # ID of the GPU device to use


# scale the test data using the training-data means and scales
X_test <- scale(X_test,
                center = attr(X_train, "scaled:center"),
                scale = attr(X_train, "scaled:scale"))


# posterior-mean predictions; add the sampled intercept mu to each draw
p_train1 <- colMeans(tmp00$mu + tmp00$beta %*% t(X_train))
p_test1  <- colMeans(tmp00$mu + tmp00$beta %*% t(X_test))

plot(y_train, p_train1, col = "red", xlab = "actual", ylab = "predicted")
points(y_test, p_test1, col = "green")
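Averaging per-sample predictions as above is equivalent to predicting with the posterior means of mu and beta, since the predictor is linear in both. A self-contained sketch with simulated draws standing in for tmp00$mu and tmp00$beta:

```r
# Fake posterior draws: T_samp samples of an intercept and p coefficients
set.seed(1)
T_samp <- 100; p <- 3; n <- 5
beta <- matrix(rnorm(T_samp * p), T_samp, p)  # stand-in for tmp00$beta
mu   <- rnorm(T_samp)                         # stand-in for tmp00$mu
X    <- matrix(rnorm(n * p), n, p)

pred_avg  <- colMeans(mu + beta %*% t(X))           # average of per-sample predictions
pred_mean <- mean(mu) + drop(X %*% colMeans(beta))  # predict with posterior means
all.equal(pred_avg, pred_mean)                      # TRUE: by linearity
```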

bayesCL documentation built on May 2, 2019, 3:43 p.m.
