View source: R/mcmc_functions.R
This function runs the HMC algorithm on a generic model, provided that the log posterior function logPOSTERIOR and its gradient glogPOSTERIOR are supplied. All parameters specified within the list param are passed to these two functions. The tuning parameters epsilon and L are passed to the leapfrog algorithm.
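As a sketch of what these two user-supplied functions might look like, consider a normal-mean model with known standard deviation and a flat prior on the mean. The function names norm_lp and g_norm_lp and the argument sd_known are illustrative, and this assumes hmc passes theta as the first argument and the elements of param as named arguments, as the examples below suggest.

# Sketch of a user-written log posterior and gradient for hmc().
# Names (norm_lp, g_norm_lp, sd_known) are illustrative, not part of the package.

# Log posterior for a normal mean with known sd and a flat prior on mu
norm_lp <- function(theta, y, sd_known) {
  sum(dnorm(y, mean = theta[1], sd = sd_known, log = TRUE))
}

# Gradient of the log posterior with respect to mu
g_norm_lp <- function(theta, y, sd_known) {
  sum(y - theta[1]) / sd_known^2
}

# These would be supplied to hmc() via logPOSTERIOR/glogPOSTERIOR,
# with y and sd_known provided in the 'param' list, e.g.:
# hmc(N = 500, theta.init = 0, epsilon = 0.01, L = 10,
#     logPOSTERIOR = norm_lp, glogPOSTERIOR = g_norm_lp,
#     param = list(y = y, sd_known = 1))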
Arguments:

N: Number of MCMC samples
theta.init: Vector of initial values for the parameters
epsilon: Step-size tuning parameter for the leapfrog algorithm
L: Number of leapfrog steps tuning parameter
logPOSTERIOR: Function to calculate and return the log posterior given a vector of values of theta
glogPOSTERIOR: Function to calculate and return the gradient of the log posterior given a vector of values of theta
randlength: Logical to determine whether to apply some randomness to the number of leapfrog steps tuning parameter L
Mdiag: Optional vector of the diagonal of the mass matrix M
constrain: Optional vector indicating which parameters in theta are constrained (e.g., restricted to positive values)
verbose: Logical to determine whether to display the progress of the HMC algorithm
varnames: Optional vector of theta parameter names
param: List of additional parameters passed to logPOSTERIOR and glogPOSTERIOR
chains: Number of MCMC chains to run
parallel: Logical to set whether multiple MCMC chains should be run in parallel (see the sketch following this list)
...: Additional parameters for logPOSTERIOR
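As a sketch of the chains and parallel arguments, the linear-regression setup from the Examples section can be rerun with two chains. The object name fm_2chains and the simulated data are illustrative; only documented arguments are used.

# Illustrative: two chains, optionally run concurrently
library(hmclearn)
set.seed(143)
X <- cbind(1, matrix(rnorm(300), ncol = 3))
betavals <- c(0.5, -1, 2, -3)
y <- X %*% betavals + rnorm(100, sd = 0.2)

fm_2chains <- hmc(N = 500,
                  theta.init = c(rep(0, 4), 1),
                  epsilon = 0.01,
                  L = 10,
                  logPOSTERIOR = linear_posterior,
                  glogPOSTERIOR = g_linear_posterior,
                  varnames = c(paste0("beta", 0:3), "log_sigma_sq"),
                  param = list(y = y, X = X),
                  chains = 2,
                  parallel = FALSE)  # set parallel = TRUE to run the chains concurrently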
Value: an object of class hmclearn, containing the elements described below.
Elements for hmclearn objects:

N: Number of MCMC samples
theta: Nested list of length N of the sampled values of theta for each chain
thetaCombined: List of data frames containing the sampled values, one for each chain
r: List of length N of the sampled momenta
theta.all: Nested list of all parameter values of theta sampled prior to the accept/reject step for each chain
r.all: List of all values of the momenta r sampled prior to accept/reject
accept: Number of accepted proposals. The ratio accept / N is the acceptance rate (see the sketch after this list)
accept_v: Vector of length N indicating which samples were accepted
M: Mass matrix used in the HMC algorithm
algorithm: HMC for Hamiltonian Monte Carlo
varnames: Optional vector of parameter names
chains: Number of MCMC chains
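A brief, illustrative sketch of inspecting these elements, assuming an hmclearn object behaves as a standard R list and continuing the fm1_hmc fit from the Examples section:

# Illustrative: assumes fm1_hmc was created as in the linear regression example below
fm1_hmc$accept / fm1_hmc$N           # acceptance rate, as described above
fm1_hmc$algorithm                    # "HMC"
head(fm1_hmc$thetaCombined[[1]])     # sampled values from the first chain (data frame)
fm1_hmc$varnames                     # parameter names, if supplied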
Available logPOSTERIOR and glogPOSTERIOR functions:

linear_posterior: Linear regression, log posterior
g_linear_posterior: Linear regression, gradient of the log posterior
logistic_posterior: Logistic regression, log posterior (see the sketch after this list)
g_logistic_posterior: Logistic regression, gradient of the log posterior
poisson_posterior: Poisson (count) regression, log posterior
g_poisson_posterior: Poisson (count) regression, gradient of the log posterior
lmm_posterior: Linear mixed effects model, log posterior
g_lmm_posterior: Linear mixed effects model, gradient of the log posterior
glmm_bin_posterior: Logistic mixed effects model, log posterior
g_glmm_bin_posterior: Logistic mixed effects model, gradient of the log posterior
glmm_poisson_posterior: Poisson mixed effects model, log posterior
g_glmm_poisson_posterior: Poisson mixed effects model, gradient of the log posterior
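As a sketch of how the logistic functions slot into the same pattern as the examples below, the following uses logistic_posterior and g_logistic_posterior with simulated binary data. The data-generating values and the object name fm_logit are illustrative, and it is assumed that logistic_posterior accepts y and X through param in the same way as the other built-in posteriors.

# Illustrative: logistic regression via the supplied posterior functions
library(hmclearn)
set.seed(412)
X <- cbind(1, matrix(rnorm(200), ncol = 2))   # 100 x 3 design matrix
betavals <- c(-0.5, 1.0, -1.2)
p <- 1 / (1 + exp(-X %*% betavals))
y <- rbinom(nrow(X), size = 1, prob = as.vector(p))

fm_logit <- hmc(N = 500,
                theta.init = rep(0, 3),
                epsilon = 0.01,
                L = 10,
                logPOSTERIOR = logistic_posterior,
                glogPOSTERIOR = g_logistic_posterior,
                varnames = paste0("beta", 0:2),
                param = list(y = y, X = X),
                chains = 1, parallel = FALSE)
summary(fm_logit, burnin = 100)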
Author(s): Samuel Thomas samthoma@iu.edu, Wanzhu Tu wtu@iu.edu
References:

Neal, Radford. 2011. MCMC Using Hamiltonian Dynamics. In Handbook of Markov Chain Monte Carlo, edited by Steve Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng, 116–62. Chapman & Hall/CRC.
Betancourt, Michael. 2017. A Conceptual Introduction to Hamiltonian Monte Carlo.
Thomas, S., Tu, W. 2020. Learning Hamiltonian Monte Carlo in R.
Examples:

# Linear regression example
set.seed(521)
# Simulate a 100 x 4 design matrix (intercept plus three covariates) and the response
X <- cbind(1, matrix(rnorm(300), ncol = 3))
betavals <- c(0.5, -1, 2, -3)
y <- X %*% betavals + rnorm(100, sd = 0.2)

# Fit the linear regression model by HMC; theta holds beta0-beta3 and log(sigma^2)
fm1_hmc <- hmc(N = 500,
               theta.init = c(rep(0, 4), 1),
               epsilon = 0.01,
               L = 10,
               logPOSTERIOR = linear_posterior,
               glogPOSTERIOR = g_linear_posterior,
               varnames = c(paste0("beta", 0:3), "log_sigma_sq"),
               param = list(y = y, X = X), parallel = FALSE, chains = 1)

# Posterior summary after discarding the first 100 samples as burn-in
summary(fm1_hmc, burnin = 100)
# Poisson regression example
set.seed(7363)
# Simulate a 20 x 3 design matrix and Poisson counts with a log link
X <- cbind(1, matrix(rnorm(40), ncol = 2))
betavals <- c(0.8, -0.5, 1.1)
lmu <- X %*% betavals
y <- sapply(exp(lmu), FUN = rpois, n = 1)

# Fit the Poisson regression model by HMC
fm2_hmc <- hmc(N = 500,
               theta.init = rep(0, 3),
               epsilon = 0.01,
               L = 10,
               logPOSTERIOR = poisson_posterior,
               glogPOSTERIOR = g_poisson_posterior,
               varnames = paste0("beta", 0:2),
               param = list(y = y, X = X),
               parallel = FALSE, chains = 1)

# Posterior summary after a burn-in of 100 samples
summary(fm2_hmc, burnin = 100)
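As an optional, illustrative check of the Poisson example, the posterior summaries above can be compared against maximum-likelihood estimates from base R's glm; the object name freq_fit is arbitrary.

# Illustrative check: frequentist Poisson GLM fit on the same simulated data
freq_fit <- glm(y ~ X - 1, family = poisson(link = "log"))
coef(freq_fit)   # MLEs of beta0, beta1, beta2, for comparison with the posterior means above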