carfima: Fitting a CARFIMA(p, H, q) model via frequentist or Bayesian...


Description

A general-order CARFIMA(p, H, q) model for p>q is

Y_t^{(p)} - α_p Y_t^{(p-1)} - ⋯ - α_1 Y_t = σ(B_{t, H}^{(1)} + β_1 B_{t, H}^{(2)} + ⋯ + β_q B_{t, H}^{(q+1)}),

where B_{t, H} = B_t^H is the standard fractional Brownian motion, H is the Hurst parameter, and the superscript (j) indicates j-fold differentiation with respect to t; see Equation (1) of Tsai and Chan (2005) for details. The model has p+q+2 unknown model parameters; p α_j's, q β_j's, H, and σ.
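For example, setting p = 2 and q = 1 in the display above gives the CARFIMA(2, H, 1) model

Y_t^{(2)} - α_2 Y_t^{(1)} - α_1 Y_t = σ(B_{t, H}^{(1)} + β_1 B_{t, H}^{(2)}),

which has 2+1+2 = 5 unknown parameters: α_1, α_2, β_1, H, and σ.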

Usage

carfima(Y, time, ar.p, ma.q, method = c("mle", "bayes"), bayes.param.ini,
  bayes.param.scale, bayes.n.warm, bayes.n.sample)

Arguments

Y

A vector of length k for the observed data.

time

A vector of length k for the observation times.

ar.p

A positive integer for the order of the AR model. ar.p must be greater than ma.q. If ar.p is greater than 2, numerical errors may occur.

ma.q

A non-negative integer for the order of the MA model. ma.q must be smaller than ar.p.

method

Either "mle" or "bayes". Method "mle" conducts the MLE-based inference, producing MLEs and asymptotic uncertainties of the model parameters. Method "bayes" draws posterior samples of the model parameters.

bayes.param.ini

Only if method is "bayes". A vector of length p+q+2 for the initial values of the p α_j's, q β_j's, H, and σ used to initialize the Markov chain Monte Carlo method (a Metropolis-within-Gibbs sampler). When a CARFIMA(2, H, 1) model is fitted, for example, users should set five initial values, in the order α_1, α_2, β_1, H, and σ; see the illustrative snippet at the end of this argument list.

bayes.param.scale

Only if method is "bayes". A vector of length p+q+2 for jumping (proposal) scales of the Metropolis steps. Each number determines how far a Metropolis step reaches out in each parameter space. Note that the last number of this vector is the jumping scale to update σ^2 on a log scale. The adaptive MCMC automatically adjusts these jumping scales during the run.

bayes.n.warm

Only if method is "bayes". A scalar for the number of burn-in iterations, i.e., the number of initial iterations to be discarded to remove the effect of the initial values.

bayes.n.sample

Only if method is "bayes". A scalar for the number of posterior samples for each parameter.
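As a concrete illustration of bayes.param.ini and bayes.param.scale for a CARFIMA(2, H, 1) fit (the numbers below are illustrative placeholders, not recommended values):

# Illustrative only: parameter order is alpha_1, alpha_2, beta_1, H, sigma.
# The last entry of the scale vector acts on sigma^2 on the log scale.
ini   <- c(-0.5, -0.3, 0.1, 0.75, 0.2)
scale <- rep(0.2, 5)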

Details

The function carfima fits the model, producing either the maximum likelihood estimates (MLEs) of the model parameters with their asymptotic uncertainties or posterior samples of the parameters, depending on the argument method. The MLEs, except that of σ, are obtained from a profile likelihood by a global optimizer, the differential evolution algorithm, on restricted ranges, i.e., (-0.99, -0.01) for each α_j, (-10, 10) for each β_j, and (0.51, 0.99) for H; the MLE of σ is then computed deterministically. The corresponding asymptotic uncertainties are based on a numerical Hessian matrix calculated at the MLEs (see the function hessian in the package numDeriv). The function also computes the Akaike Information Criterion (AIC), defined as -2(log likelihood -p-q-2). The function carfima becomes numerically unstable if p>2, and thus it may produce numerical errors. (The original Fortran code used in Tsai and Chan (2005) does not have this numerical issue because it uses a different global optimizer, DBCONF, from the IMSL Fortran library.)
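As an illustration of this bounded global search, the following minimal sketch uses the DEoptim package (a differential evolution optimizer in R) with a placeholder objective; neg.profile.loglik is a hypothetical stand-in for the negative profile log-likelihood of a CARFIMA(1, H, 0) model and is not part of the carfima package.

library(DEoptim)

# Hypothetical placeholder standing in for the negative profile
# log-likelihood of a CARFIMA(1, H, 0) model; NOT the carfima internals.
neg.profile.loglik <- function(par) {
  (par[1] + 0.4)^2 + (par[2] - 0.75)^2
}

# Restricted search ranges: (-0.99, -0.01) for alpha_1, (0.51, 0.99) for H.
opt <- DEoptim(fn = neg.profile.loglik,
               lower = c(-0.99, 0.51), upper = c(-0.01, 0.99),
               control = DEoptim.control(trace = FALSE))
opt$optim$bestmem  # approximate minimizer, i.e., the (alpha_1, H) estimates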

The Bayesian approach uses independent prior distributions for the unknown model parameters: a Uniform(-0.99, -0.01) prior for each α_j, a Normal(0, 10^4) prior for each β_j, a Uniform(0.51, 0.99) prior for H (a long-memory process), and finally an inverse-Gamma(shape = 2.01, scale = 10^3) prior for σ^2. Posterior propriety holds because the prior distributions are jointly proper. It also adopts appropriate proposal density functions: a truncated Normal(current state, proposal scale) between -0.99 and -0.01 for each α_j, a Normal(current state, proposal scale) for each β_j, a truncated Normal(current state, proposal scale) between 0.51 and 0.99 for H, and finally a Normal(log(current state), proposal scale on a log scale) for σ^2, i.e., σ^2 is updated on a log scale. The full posterior is sampled via a Metropolis-within-Gibbs sampler. The sampler also adopts adaptive Markov chain Monte Carlo (MCMC), updating the proposal scales every 100 iterations: if the acceptance rate of the most recent 100 proposals (checked at the end of the i-th batch of 100 iterations) is smaller than 0.3, it multiplies the current proposal scale by exp(-min(0.01, 1/√i)); if it is larger than 0.3, it multiplies the current proposal scale by exp(min(0.01, 1/√i)). The Markov chains equipped with this adaptive MCMC converge to the stationary distribution because the adjustment factors, exp(±min(0.01, 1/√i)), approach unity as i goes to infinity, satisfying the diminishing adaptation condition. The function carfima becomes numerically unstable if p>2, and thus it may produce numerical errors. The output returns the AIC, for which the log likelihood is evaluated at the posterior medians of the unknown model parameters.
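The scale-adaptation rule can be summarized in a few lines; update.scale below is a hypothetical helper written only to mirror the description above, not a carfima function.

# n.accept: accepted proposals among the most recent 100 iterations;
# i: index of the completed batch of 100 iterations.
update.scale <- function(scale, n.accept, i) {
  factor <- exp(min(0.01, 1 / sqrt(i)))  # approaches 1 as i grows
  if (n.accept / 100 < 0.3) scale / factor else scale * factor
}

update.scale(scale = 0.2, n.accept = 12, i = 4)  # rate 0.12 < 0.3: scale shrinks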

If the MLE-based method produces numerical errors, we recommend running the Bayesian method, which is numerically more stable (though computationally more expensive).

Value

A named list composed of:

mle

If method is "mle". Maximum likelihood estimates of the model parameters, p α_j's, q β_j's, H, and σ.

se

If method is "mle". Asymptotic uncertainties (standard errors) of the MLEs.

param

If method is "bayes". An m by (p+q+2) matrix where m is the number of posterior draws (bayes.n.sample) and the columns correspond to parameters, p α_j's, q β_j's, H, and σ.

accept

If method is "bayes". A vector of length p+q+2 for the acceptance rates of the Metropolis steps.

AIC

For both methods. Akaike Information Criterion, -2(log likelihood -p-q-2). The log likelihood is evaluated at the MLEs if method is "mle" and at the posterior medians of the unknown model parameters if method is "bayes".

fitted.values

For both methods. A vector of length k for the values of E(Y_{t_i}\vert Y_{<t_i}), i=1, 2, …, k, where k is the number of observations and Y_{<t_i} represents all data observed before t_i. Note that E(Y_{t_1}\vert Y_{<t_1})=0. MLEs of the model parameters are used to compute E(Y_{t_i}\vert Y_{<t_i})'s if method is "mle" and posterior medians of the model parameters are used if method is "bayes".
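For example, one-step-ahead prediction errors follow directly from this component (a usage sketch; y and res1 refer to the objects created in the Examples below):

resid <- y - res1$fitted.values  # observed minus one-step-ahead predictions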

Details

The function carfima produces MLEs, their asymptotic uncertainties, and AIC if method is "mle". It produces the posterior samples of the model parameters, acceptance rates, and AIC if method is "bayes".

References

Tsai, H. and Chan, K. S. (2000). A note on the covariance structure of a continuous-time ARMA process. Statistica Sinica, 10, 989-998.

Tsai, H. and Chan, K. S. (2005). Maximum likelihood estimation of linear continuous time long memory processes with discrete time data. Journal of the Royal Statistical Society: Series B, 67(5), 703-716.

Examples

##### Irregularly spaced observation time generation.

length.time <- 100
time.temp <- rexp(length.time, rate = 2)
time <- rep(NA, length.time + 1)
time[1] <- 0
for (i in 2 : (length.time + 1)) {
  time[i] <- time[i - 1] + time.temp[i - 1]
}
time <- time[-1]

##### Data generation for CARFIMA(1, H, 0) based on the observation times.

parameter <- c(-0.4, 0.75, 0.2)
# AR parameter alpha = -0.4
# Hurst parameter = 0.75
# process uncertainty sigma = 0.2
y <- carfima.sim(parameter = parameter, time = time, ar.p = 1, ma.q = 0)  

#### Estimation 1 : MLE

res1 <- carfima(Y = y, time = time, method = "mle", ar.p = 1, ma.q = 0)

#### Estimation 2 : Bayes

res2 <- carfima(Y = y, time = time, method = "bayes", ar.p = 1, ma.q = 0,
                bayes.param.ini = parameter,
                bayes.param.scale = rep(0.2, length(parameter)),
                bayes.n.warm = 100, bayes.n.sample = 1000)
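
#### Accessing the results (component names as documented under Value)

res1$mle     # MLEs of alpha_1, H, and sigma
res1$se      # their asymptotic standard errors
res1$AIC     # AIC evaluated at the MLEs

apply(res2$param, 2, median)  # posterior medians of alpha_1, H, and sigma
res2$accept                   # acceptance rates of the Metropolis steps
res2$AIC                      # AIC evaluated at the posterior medians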
