View source: R/lag_selection.R
select_lag_maic (R Documentation)
Description:

Selects the optimal number of lags for the ADF regression using the Modified Akaike Information Criterion (MAIC) of Ng and Perron (2001).
Usage:

select_lag_maic(y, maxlag = NULL, detrend = "constant")
Arguments:

y: Numeric vector. Time series data.

maxlag: Integer or NULL. Maximum number of lags to consider.

detrend: Character. Detrending method: "constant" (demean) or "none". Default is "constant".
Details:

The MAIC criterion is defined as

MAIC(k) = \ln(\hat{\sigma}^2_k) + \frac{2(\tau_T(k) + k)}{T - k_{\max}}

where \hat{\sigma}^2_k is the residual variance from the ADF regression with k lags and \tau_T(k) = \hat{\sigma}^{-2}_k \hat{\beta}^2_0 \sum_{t=k_{\max}+1}^{T} \tilde{y}^2_{t-1}, with \hat{\beta}_0 the coefficient on the lagged level \tilde{y}_{t-1} and \tilde{y}_t the detrended series. The data-dependent penalty \tau_T(k) is what distinguishes MAIC from standard AIC.
This criterion provides better size properties than standard AIC for unit root testing.
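The criterion above can be sketched in plain R. The function below, select_lag_maic_sketch, the Schwert-rule default for maxlag, and all internal details are assumptions for illustration only, not the package's actual implementation:

```r
# Sketch of MAIC-based lag selection for the ADF regression
# (assumed implementation; "constant" detrending, i.e. demeaning).
select_lag_maic_sketch <- function(y, maxlag = NULL) {
  n <- length(y)
  # Schwert-style default for the maximum lag (an assumption)
  if (is.null(maxlag)) maxlag <- floor(12 * (n / 100)^0.25)
  yt <- y - mean(y)            # demean ("constant" detrending)
  dy <- diff(yt)               # dy[i] = yt[i + 1] - yt[i]
  kmax <- maxlag
  idx <- (kmax + 1):length(dy) # common estimation sample for every k
  T_eff <- length(idx)
  maic <- numeric(kmax + 1)
  for (k in 0:kmax) {
    ylag <- yt[idx]            # lagged level y_{t-1}
    X <- ylag
    if (k > 0) {
      for (j in 1:k) X <- cbind(X, dy[idx - j])  # lagged differences
    }
    fit <- lm.fit(as.matrix(X), dy[idx])
    e <- fit$residuals
    sigma2 <- sum(e^2) / T_eff
    b0 <- fit$coefficients[1]  # coefficient on the lagged level
    tau <- (b0^2 * sum(ylag^2)) / sigma2   # Ng-Perron penalty term
    maic[k + 1] <- log(sigma2) + 2 * (tau + k) / T_eff
  }
  k_star <- which.min(maic) - 1            # lags are 0-based
  structure(
    list(selected_lag = k_star, maic = maic[k_star + 1],
         all_maic = maic, maxlag = kmax),
    class = "lag_selection"
  )
}
```

The key design point is estimating every candidate regression on the same sample (observations after kmax), so the residual variances, and hence the MAIC values, are comparable across k.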
Value:

A list with class "lag_selection" containing:

selected_lag: Optimal lag selected by MAIC.

maic: MAIC value at the optimal lag.

all_maic: Vector of MAIC values for all lags considered.

maxlag: Maximum lag considered.
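Assuming the list structure documented above (the numbers below are made up for illustration), the components are accessed like any R list; note that lag k is stored at position k + 1 of all_maic because lags start at 0:

```r
# Hypothetical result object mirroring the documented structure
res <- structure(
  list(selected_lag = 3, maic = -0.21,
       all_maic = c(0.05, -0.10, -0.18, -0.21, -0.19),
       maxlag = 4),
  class = "lag_selection"
)
res$selected_lag                      # optimal lag (0-based)
res$all_maic[res$selected_lag + 1]   # MAIC at that lag; equals res$maic
```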
References:

Ng, S., & Perron, P. (2001). Lag length selection and the construction of unit root tests with good size and power. Econometrica, 69(6), 1519-1554. doi:10.1111/1468-0262.00256
Examples:

# Generate random walk
set.seed(123)
y <- cumsum(rnorm(200))
# Select lag
lag_sel <- select_lag_maic(y)
print(lag_sel)