View source: R/logistic_regression.R
vblogit | R Documentation
Bayesian fit of a logistic regression model with p coefficients and n observations.
Usage

vblogit(y, X, offset, eps = 0.01, m0, S0, S0i, xi0, verb = FALSE,
        maxiter = 1000, ...)
Arguments

y: binary vector of responses, length n.

X: n x p matrix of covariates, including a column of 1s for the intercept.

offset: n-vector of offsets (or a scalar, which will be replicated to length n).

eps: convergence criterion; iteration stops when the increase in the marginal log-likelihood lower bound is no more than this.

m0: p-vector of prior means.

S0: p x p prior covariance matrix.

xi0: vector of initial values for the variational parameters.

verb: logical; print verbose progress output.

maxiter: upper limit on the number of iterations.

...: ignored.

Details

Computes the posterior distribution of the regression coefficients in logistic regression using the variational method of Jaakkola & Jordan (1996).
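To make the method concrete, the following is a minimal sketch of the Jaakkola & Jordan variational updates, not the package's implementation. It assumes a fixed standard-normal prior (prior mean 0, identity prior covariance) and hard-codes the update loop; the function name `vb_sketch` and all defaults are illustrative only.

```r
# Jaakkola & Jordan (1996) variational bound for logistic regression:
# each sigmoid is lower-bounded using a per-observation parameter xi_i,
# giving closed-form Gaussian updates for the coefficient posterior.
lambda <- function(xi) tanh(xi / 2) / (4 * xi)

vb_sketch <- function(y, X, maxiter = 100, eps = 1e-6) {
  n <- nrow(X); p <- ncol(X)
  S0i <- diag(p)         # prior precision (assumed identity here)
  m0  <- rep(0, p)       # prior mean (assumed zero here)
  xi  <- rep(1, n)       # variational parameters, one per observation
  m_old <- rep(Inf, p)
  for (it in seq_len(maxiter)) {
    L  <- lambda(xi)
    Si <- S0i + 2 * t(X) %*% (L * X)                  # posterior precision
    S  <- solve(Si)                                   # posterior covariance
    m  <- S %*% (S0i %*% m0 + t(X) %*% (y - 0.5))     # posterior mean
    xi <- sqrt(rowSums((X %*% (S + m %*% t(m))) * X)) # xi_i^2 = x_i' E[bb'] x_i
    if (max(abs(m - m_old)) < eps) break
    m_old <- m
  }
  list(coef = drop(m), cov = S, xi = xi)
}
```

Each pass alternates a Gaussian update of the coefficient posterior with a closed-form update of the xi_i, and each step is guaranteed not to decrease the likelihood bound, which is why a simple change-in-mean stopping rule suffices.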
Examples

## some data
n <- 100
p <- 10
X <- matrix(rnorm(n * p), ncol = p)
theta <- rnorm(p)
prob <- 1 / (1 + exp(-X %*% theta))
y <- rbinom(n, 1, prob)

## See that it works:
## vb:
fit_vb <- vblogit(y, X, verb = TRUE)
## glm:
fit_glm <- glm(y ~ -1 + X, family = binomial)
coefs <- cbind(vb = fit_vb$coef, glm = fit_glm$coef)
summary(fit_vb)

## compare vb and glm
plot(coefs, main = "Estimates")
abline(0, 1)

## Compare to true coefficients
## (vblogit estimates are plotted first, in black; glm in green)
plot(coefs[, 1] - theta)
points(coefs[, 2] - theta, col = 3, pch = 4)
abline(h = 0)
legend("topright", c("vblogit", "glm"), col = c(1, 3), pch = c(1, 4))