predict.reglogit: Prediction for regularized (polychotomous) logistic...

View source: R/reglogit.R


Prediction for regularized (polychotomous) logistic regression models

Description

Sampling from the posterior predictive distribution of a regularized (multinomial) logistic regression fit, including entropy information for variability assessment

Usage

## S3 method for class 'reglogit'
predict(object, XX, burnin = round(0.1 * nrow(object$beta)), ...)
## S3 method for class 'regmlogit'
predict(object, XX, burnin = round(0.1 * dim(object$beta)[1]), ...) 

Arguments

object

a "reglogit"-class object or a "regmlogit"-class object, depending on whether binary or polychotomous methods were used for fitting

XX

a matrix of predictive locations with the same number of columns as the design matrix used to fit object.

burnin

a scalar positive integer indicating the number of samples of object$beta to discard as burn-in; the default is 10% of the number of samples

...

For compatibility with the generic predict method; not used

Details

Applies the logit transformation (reglogit) or multinomial logit (regmlogit) to convert samples of the linear predictor at XX into samples from the posterior predictive probability distribution. The raw probabilities, their averages (posterior means), entropies, and posterior mean classes (the arg max of the average probabilities) are returned.
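
For intuition, the binary case amounts to pushing each retained sample of the linear predictor through the inverse-logit map and then summarizing across samples at each location. The following is a hypothetical sketch of that bookkeeping, not the package's internal code; the eta matrix below simply stands in for samples of the linear predictor at XX.

## hypothetical sketch of the summaries described above (not package code)
eta <- matrix(rnorm(5 * 200), nrow=5)      ## stand-in linear-predictor samples:
                                           ## 5 locations by 200 retained draws
pr  <- 1 / (1 + exp(-eta))                 ## inverse logit -> probabilities of class 1
mp  <- rowMeans(pr)                        ## posterior mean probability per location
pc  <- round(mp)                           ## class labels by rounding
ent <- -(mp*log(mp) + (1-mp)*log(1-mp))    ## one reading of the reported entropy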

Value

The output is a list with components explained below. For predict.regmlogit, everything (except entropy) is expanded by one dimension into an array or matrix as appropriate.

p

a nrow(XX) x (T-burnin) sized matrix of probabilities (of class 1) from the posterior predictive distribution.

mp

a vector of average probabilities calculated over the rows of p

pc

class labels formed by rounding (for predict.reglogit) or taking the arg max (for predict.regmlogit) of the values in mp

ent

The posterior mean entropy given the probabilities in mp
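
For the polychotomous case, one natural reading of ent (an assumption here, not taken from the package source) is the Shannon entropy of the averaged class probabilities at each predictive location, for example:

## hypothetical sketch: Shannon entropy by location from a matrix of class
## probabilities (rows = predictive locations, columns = classes)
mp <- matrix(c(0.8, 0.1, 0.1,
               1/3, 1/3, 1/3), nrow=2, byrow=TRUE)
ent <- -rowSums(mp * log(mp))   ## smaller for the confident row; log(3) for the uniform row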

Author(s)

Robert B. Gramacy rbg@vt.edu

References

R.B. Gramacy and N.G. Polson (2012). "Simulation-based regularized logistic regression". Bayesian Analysis, 7(3), p567-590; arXiv:1005.3430; https://arxiv.org/abs/1005.3430

C. Holmes and L. Held (2006). "Bayesian Auxiliary Variable Models for Binary and Multinomial Regression". Bayesian Analysis, 1(1), p145-168.

See Also

reglogit and regmlogit

Examples

## see reglogit for a full example of binary classification complete with
## sampling from the posterior predictive distribution.  
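
## a minimal binary sketch for comparison (hypothetical data; assumes the
## reglogit(T, y, X) interface with a 0/1 response, mirroring regmlogit below)
## Not run: 
Xb <- matrix(rnorm(200), ncol=2)
yb <- as.numeric(Xb[,1] + rnorm(100, sd=0.5) > 0)
fitb <- reglogit(1000, yb, Xb)
pb <- predict(fitb, Xb)    ## pb$mp, pb$pc and pb$ent as documented above
## End(Not run)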

## the example here is for polychotomous classification and prediction

## Not run: 
library(plgp)
x <- seq(-2, 2, length=40)
X <- expand.grid(x, x)
C <- exp2d.C(X)
xx <- seq(-2, 2, length=100)
XX <- expand.grid(xx, xx)
CC <- exp2d.C(XX)

## build cubically-expanded design matrix (with interactions)
Xe <- cbind(X, X[,1]^2, X[,2]^2, X[,1]*X[,2],
            X[,1]^3, X[,2]^3, X[,1]^2*X[,2], X[,2]^2*X[,1],
            (X[,1]*X[,2])^2)

## perform MCMC
T <- 1000
out <- regmlogit(T, C, Xe, nu=6, normalize=TRUE)

## create predictive (cubically-expanded) design matrix
XX <- as.matrix(XX)
XXe <- cbind(XX, XX[,1]^2, XX[,2]^2, XX[,1]*XX[,2],
             XX[,1]^3, XX[,2]^3, XX[,1]^2*XX[,2], XX[,2]^2*XX[,1],
             (XX[,1]*XX[,2])^2)

## predict class labels
p <- predict(out, XXe)
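
## (added illustration) quick look at the returned components; names as
## documented in the Value section above
str(p, max.level=1)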

## make an image of the predictive surface
cols <- c(gray(0.85), gray(0.625), gray(0.4))
par(mfrow=c(1,3))
image(xx, xx, matrix(CC, ncol=length(xx)), col=cols, main="truth")
image(xx, xx, matrix(p$pc, ncol=length(xx)), col=cols, main="predicted")
image(xx, xx, matrix(p$ent, ncol=length(xx)), col=heat.colors(128),
      main="entropy")

## End(Not run)
