| glmBselect | R Documentation |
Variable selection and fitting of linear models using Bayesian inference.
glmBselect(
formula,
data,
graphOutput = TRUE,
nIter = 10000,
thin = 1,
effect = "fixed"
)
formula |
an object of class "formula" (or one that can be coerced to that class): a symbolic description of the model to be fitted. |
data |
an optional data frame containing the variables in the model. If not found in data, the variables are taken from environment(formula), typically the environment from which glmBselect is called. |
graphOutput |
logical; if TRUE, display graphical output for the regression parameters (MCMC trace and posterior density plots) |
nIter |
number of MCMC iterations |
thin |
thinning interval for the monitors (keep every thin-th sample) |
effect |
"fixed", "random" or "randomPrior" effect for variable selection |
Models are specified symbolically, as for lm. A typical model has the form response ~ terms, where response is the (numeric) response vector and terms is a series of terms specifying a linear predictor for response. The effect "randomPrior" places a beta prior on the model inclusion probability. This induces a distribution on the number of included variables with longer tails than the binomial distribution, allowing the model to learn the degree of sparsity from the data.
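The Kuo and Mallick (1998) approach cited below can be sketched as follows: each coefficient is paired with a binary inclusion indicator, and only variables whose indicator is 1 contribute to the linear predictor. The R sketch below illustrates the formulation only; in the actual sampler the indicators gamma are drawn by MCMC, whereas here they are fixed for illustration, and all variable names are hypothetical:

```r
# Kuo-Mallick formulation (illustrative sketch, not the package internals):
# linear predictor eta = alpha + sum_j gamma_j * beta_j * x_j,
# where gamma_j in {0, 1} is the inclusion indicator for variable j.
set.seed(1)
n <- 100; p <- 5
X <- matrix(rnorm(n * p), ncol = p)
beta  <- c(1.5, -2, 0.8, 0, 0)       # regression coefficients
gamma <- c(1, 1, 1, 0, 0)            # inclusion indicators (sampled in the MCMC)
alpha <- 0.5
eta <- alpha + X %*% (gamma * beta)  # only included variables contribute
# With effect = "randomPrior", gamma_j ~ Bernoulli(pi) with pi ~ Beta(a, b),
# so the expected number of included variables is itself learned from the data.
```

The posterior mean of each gamma_j is then the estimated inclusion probability of variable j.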
posterior estimates of the regression parameters
JuG
O'Hara, R. and Sillanpää, M. (2009) A review of Bayesian variable selection methods: what, how and which. Bayesian Analysis, 4(1):85-118.
Kuo, L. and Mallick, B. (1998) Variable selection for regression models. Sankhya B, 60(1):65-81.
# Generate example data
set.seed(42)
n <- 500
p <- 20
X <- matrix(rnorm(n * p), ncol = p)
beta <- 2^(0:(1 - p))  # geometrically decaying coefficients
print(beta)
alpha <- 3
tau <- 2
eps <- rnorm(n, 0, 1 / sqrt(tau))
# Dichotomise the linear predictor into a binary response
y <- as.numeric(cut(alpha + as.vector(X %*% beta + eps), c(-10, 3, 10))) - 1
daten <- cbind(y, as.data.frame(X))

# Classical logistic regression with stepwise AIC selection, for comparison
mod <- glm(y ~ ., data = daten, family = "binomial")
library(MASS)
stepAIC(mod, direction = "both")

# Bayesian variable selection
glmBselect(y ~ ., data = daten, nIter = 10000, graphOutput = FALSE, effect = "fixed")