Metropolis: Metropolis-Hastings Markov Chain Monte Carlo

Description Usage Arguments Details Value Author(s) References Examples

View source: R/Metropolis.R

Description

Uses the Metropolis-Hastings Markov Chain Monte Carlo (MCMC) method to sample models that fit a data set, tracking the best-fitting model found along the way.
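At its core, Metropolis-Hastings proposes a perturbed model and accepts it with a probability based on the likelihood ratio. Below is a minimal, self-contained sketch of that acceptance rule for a one-dimensional standard normal target with a symmetric proposal; this illustrates the general algorithm, not TDD's internal implementation:

```r
# Minimal Metropolis-Hastings sketch (illustrative, not TDD's implementation).
# Target: standard normal; proposal: symmetric normal step, so the proposal
# densities cancel in the acceptance ratio.
set.seed(1)
target_loglik <- function(m) -0.5 * m^2   # log-density of N(0, 1), up to a constant

m <- 0                                    # current model
niter <- 10000
samples <- numeric(niter)
for (i in seq_len(niter)) {
  m_prop <- m + rnorm(1, 0, 1)            # symmetric proposal: q(m'|m) = q(m|m')
  log_alpha <- target_loglik(m_prop) - target_loglik(m)
  if (log(runif(1)) < log_alpha)          # accept with probability min(1, alpha)
    m <- m_prop
  samples[i] <- m
}
mean(samples)   # should be near 0, the target mean
sd(samples)     # should be near 1, the target standard deviation
```

The chain's saved states approximate draws from the target distribution, which is why Metropolis returns the full matrix of visited models rather than a single optimum.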

Usage

Metropolis(loglikelihood, sigma, m1, niter, gen, logproposal,
           logprior = function(x) 0, burn = 0, save_int = 10, verbose, ...)

Arguments

loglikelihood

Function to calculate the log of a model's likelihood

sigma

Vector of standard deviations to use when generating a new model

m1

Starting "guess" model

niter

Number of iterations to run

gen

Function to generate a new model

logproposal

Function to calculate the proposal distribution for a new model

logprior

Function to calculate the log of the prior distribution value of a model

burn

Initial "burn-in" period from which results are not saved

save_int

Number of iterations between saved models

verbose

Logical: if TRUE, progress updates are printed to the screen

...

Parameters to pass to loglikelihood
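The four function arguments must agree on how a model is represented. The stubs below sketch the signatures inferred from the Examples section; they are illustrative shapes, not a formal interface published by the package:

```r
# Inferred signatures for the user-supplied functions (a sketch based on the
# Examples below, not a formal interface from the package):
loglikelihood <- function(model, ...) 0     # log p(data | model); data arrives via '...'
gen           <- function(x, sigma) x       # draw a proposed model given current model x
logproposal   <- function(x1, x2, sigma) 0  # log q(x1 | x2): proposal log-density
logprior      <- function(m) 0              # log p(m); the default is a flat (improper) prior

# Metropolis-Hastings combines these in the log acceptance ratio:
# log(alpha) = [loglikelihood(m') + logprior(m') + logproposal(m, m', sigma)]
#            - [loglikelihood(m)  + logprior(m)  + logproposal(m', m, sigma)]
```

If `gen` is symmetric (as in the Examples), `logproposal` contributes equally in both directions and only the likelihood and prior terms drive acceptance.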

Details

When verbose = TRUE, Metropolis prints a progress line to the screen every 1000 iterations. Each line includes the following:

- Number of iterations completed out of the total
- Log-likelihood of the current model
- Log-likelihood of the proposed model
- Log-likelihood of the best model found so far
- Whether this iteration's proposed model is rejected or accepted
- Acceptance ratio over the last 100 iterations

Value

List including the following elements:

m

Matrix where each row is the test model parameters of an iteration

l

log-likelihood of each iteration's model

a

Acceptance ratio (moving window of length 100)

best

List including best model found and its log-likelihood
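Assuming the structure above, posterior summaries can be computed directly from the returned list. The snippet below uses a mock object standing in for real Metropolis output; the element names inside best are illustrative, since the help text only says it contains the best model and its log-likelihood:

```r
set.seed(0)
# Mock of the return structure described above (real values would come from
# Metropolis); 'best' element names here are illustrative assumptions.
x <- list(m    = matrix(rnorm(200, mean = 3), ncol = 2), # 100 saved 2-parameter models
          l    = rnorm(100, -1),                         # log-likelihood per saved model
          a    = runif(100, 0.2, 0.5),                   # windowed acceptance ratio
          best = list(m = c(3, 3), l = -0.1))            # best model + its log-likelihood

colMeans(x$m)       # posterior mean of each parameter
apply(x$m, 2, sd)   # posterior spread of each parameter
```

Row i of x$m pairs with element i of x$l, so the two can be combined to, for example, plot the log-likelihood trace over saved iterations.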

Author(s)

Jake Anderson

References

Hastings, W.K. (1970). "Monte Carlo Sampling Methods Using Markov Chains and Their Applications". Biometrika 57 (1): 97-109.

Aster, R.C., Borchers, B., and Thurber, C.H. (2012). Parameter Estimation and Inverse Problems. Elsevier Academic Press.

Examples

# try to find a non-negative vector m so that
# 1.  sqrt(m[1]^2 + m[2]^2) is close to 5
# 2.  sqrt(m[1] * m[2]) is close to 3.5
# 3.  2 * m[1] + m[2] is close to 10

# We are trying to match this data vector:
data = c(5, 3.5, 10)

# Define log-likelihood as -0.5 * sum of squared differences between
# modeled and true data
loglikelihood = function(model, data){
  d2 = c(sqrt(sum(model^2)), sqrt(abs(prod(model))), sum(model*c(2,1)))
  -0.5 * sum((d2 - data)^2)
}

# A proposed model is generated by randomly picking a model parameter
# and perturbing it by a random number distributed normally according to sigma
generate = function(x, sigma){
  w = ceiling(runif(1) * length(x))
  x[w] = x[w] + rnorm(1, 0, sigma[w])
  return(x)
}

# Proposal distribution is defined as multivariate normal, with mean
# zero and standard deviations sigma:
logproposal = function(x1, x2, sigma){
  # small offset keeps the denominator nonzero if any sigma element is 0
  -0.5 * sum((x1 - x2)^2/(sigma + 1e-12)^2)
}

# logprior reflects prior knowledge that the answer is non-negative
logprior = function(m){
  if(all(m >= 0))
    0
  else
    -Inf
}


sigma = c(0.1, 0.1)
m1 = c(0, 0)
x = Metropolis(loglikelihood, sigma, m1, niter = 5000, gen = generate,
logproposal = logproposal, logprior = logprior, burn = 0, save_int = 1,
data = data)

# Notice the high acceptance ratios--this means that values in sigma are
# too low.  The MCMC is probably "optimally tuned" when sigma is set so 
# acceptance ratios vary between roughly 0.2 and 0.5.

# Plot models--par 1 on x, par 2 on y axis
# Note initial trajectory away from m1 (0, 0) to more likely
# region--this can be eliminated by setting 'burn' to a higher value
plot(x$m[,1], x$m[,2], pch = '.', col = rainbow(nrow(x$m)))

# Histograms/scatter plots showing posterior distributions.
# Note the strong covariance between these parameters!
par(mfrow = c(2, 2))
hist(x$m[,1])
plot(x$m[,2], x$m[,1], pch = '.')
plot(x$m[,1], x$m[,2], pch = '.')
hist(x$m[,2])
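The tuning advice in the comments above (acceptance ratios of roughly 0.2-0.5) can be automated between pilot runs. The helper below is a hypothetical sketch, not part of TDD: it rescales sigma toward a target acceptance ratio, exploiting the fact that larger steps lower acceptance:

```r
# Hypothetical tuning helper (not part of TDD): rescale sigma so the mean
# acceptance ratio moves toward a target inside the 0.2-0.5 range.
tune_sigma <- function(sigma, accept_ratio, target = 0.35) {
  # Larger proposal steps lower acceptance, so scale sigma up when the
  # observed acceptance ratio is above target, and down when it is below.
  sigma * exp(accept_ratio - target)
}

sigma <- c(0.1, 0.1)
tune_sigma(sigma, accept_ratio = 0.9)   # acceptance too high -> larger steps
tune_sigma(sigma, accept_ratio = 0.2)   # acceptance too low  -> smaller steps
```

In practice one would rerun Metropolis with the rescaled sigma and iterate until the windowed acceptance ratios in x$a settle into the target range.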

Example output

[1] "1000 of 5000 ; -1.793 ; -2.954 ; -0.00976 ; reject ; 0.94"
[1] "2000 of 5000 ; -0.4821 ; -0.5279 ; -0.00976 ; accept ; 0.98"
[1] "3000 of 5000 ; -0.6082 ; -0.5921 ; -0.00211 ; accept ; 0.9"
[1] "4000 of 5000 ; -0.2032 ; -0.1868 ; -0.00211 ; accept ; 0.89"
[1] "5000 of 5000 ; -0.4365 ; -0.615 ; -0.00211 ; reject ; 0.91"

TDD documentation built on May 2, 2019, 4:51 a.m.
