View source: R/mcmc_functions.R
This is the basic computing function for HMC and should not be called directly except by experienced users.
hmc.fit(N, theta.init, epsilon, L, logPOSTERIOR, glogPOSTERIOR,
  varnames, randlength, Mdiag, constrain, verbose, ...)
N: Number of MCMC samples
theta.init: Vector of initial values for the parameters
epsilon: Step-size parameter for leapfrog
L: Number of leapfrog steps parameter
logPOSTERIOR: Function to calculate and return the log posterior given a vector of values of theta (see the sketch after this list)
glogPOSTERIOR: Function to calculate and return the gradient of the log posterior given a vector of values of theta
varnames: Optional vector of theta parameter names
randlength: Logical to determine whether to apply some randomness to the number of leapfrog steps tuning parameter
Mdiag: Optional vector of the diagonal of the mass matrix
constrain: Optional vector of which parameters in theta accept positive values only
verbose: Logical to determine whether to display the progress of the HMC algorithm
...: Additional parameters for logPOSTERIOR and glogPOSTERIOR
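As a minimal sketch of the interface these two arguments are assumed to follow (the function names and the normal-mean model below are illustrative, not part of the package): each function takes the parameter vector theta as its first argument, and any data it needs is passed through the ... argument of hmc.fit.

# Hypothetical log posterior and gradient for y_i ~ N(theta, 1) with a flat
# prior on theta; both take the parameter vector first and the data via y
logp_normal <- function(theta, y) {
  -0.5 * sum((y - theta)^2)      # log posterior up to an additive constant
}
glogp_normal <- function(theta, y) {
  sum(y - theta)                 # gradient with respect to theta
}

# These would be supplied as, for example:
# hmc.fit(N = 1000, theta.init = 0, epsilon = 0.01, L = 20,
#         logPOSTERIOR = logp_normal, glogPOSTERIOR = glogp_normal, y = y)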
List for hmc, containing the elements below.

Elements for hmclearn objects:

N: Number of MCMC samples
theta: Nested list of length N of the sampled values of theta for each chain
thetaCombined: List of data frames containing the sampled values, one for each chain
r: List of length N of the sampled momenta
theta.all: Nested list of all parameter values of theta sampled prior to the accept/reject step for each chain
r.all: List of all values of the momenta r sampled prior to the accept/reject step
accept: Number of accepted proposals. The ratio accept / N is the acceptance rate
accept_v: Vector of length N indicating which samples were accepted
M: Mass matrix used in the HMC algorithm
algorithm: HMC for Hamiltonian Monte Carlo
Neal, Radford. 2011. MCMC Using Hamiltonian Dynamics. In Handbook of Markov Chain Monte Carlo, edited by Steve Brooks, Andrew Gelman, Galin L. Jones, and Xiao-Li Meng, 116–62. Chapman & Hall/CRC.
Betancourt, Michael. 2017. A Conceptual Introduction to Hamiltonian Monte Carlo.
Thomas, S., Tu, W. 2020. Learning Hamiltonian Monte Carlo in R.
# Logistic regression example
library(hmclearn)
X <- cbind(1, seq(-100, 100, by=0.25))
betavals <- c(-0.9, 0.2)
lodds <- X %*% betavals
prob1 <- as.numeric(1 / (1 + exp(-lodds)))
set.seed(9874)
y <- sapply(prob1, function(xx) {
sample(c(0, 1), 1, prob=c(1-xx, xx))
})
# fit the logistic model via HMC using the package's logistic posterior and gradient
f1 <- hmc.fit(N = 500,
theta.init = rep(0, 2),
epsilon = c(0.1, 0.002),
L = 10,
logPOSTERIOR = logistic_posterior,
glogPOSTERIOR = g_logistic_posterior,
y=y, X=X)
f1$accept / f1$N   # acceptance rate
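Beyond the acceptance rate, the other elements documented above can be inspected directly from the returned list; a brief sketch continuing the example:

f1$M                     # mass matrix used in the HMC algorithm
str(f1$thetaCombined)    # sampled values of theta, organized per chain
table(f1$accept_v)       # which of the N samples were accepted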