select_lag_maic: Select Optimal Lag using MAIC Criterion

View source: R/lag_selection.R

select_lag_maic {boundedur}    R Documentation

Select Optimal Lag using MAIC Criterion

Description

Selects the optimal number of lags for the ADF regression using the Modified Akaike Information Criterion (MAIC) of Ng and Perron (2001).

Usage

select_lag_maic(y, maxlag = NULL, detrend = "constant")

Arguments

y

Numeric vector. Time series data.

maxlag

Integer or NULL. Maximum lag to consider. If NULL, uses the rule floor(12 * (T/100)^0.25), where T is the length of the series.

detrend

Character. Detrending method: "constant" (demean) or "none". Default is "constant".
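For illustration, the default maximum-lag rule above can be computed directly (the helper name here is mine, not part of the package):

```r
# Default maximum lag for a series of length T, per the rule in 'maxlag'
default_maxlag <- function(T) floor(12 * (T / 100)^0.25)
default_maxlag(200)  # 14
```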

Details

The MAIC criterion of Ng and Perron (2001) is defined as:

MAIC(k) = \ln(\hat{\sigma}^2_k) + 2(\tau_T(k) + k)/(T - k_{max})

where \hat{\sigma}^2_k is the residual variance from the ADF regression with k lagged differences, \tau_T(k) = \hat{\sigma}^{-2}_k \hat{\beta}^2_0 \sum_t \tilde{y}^2_{t-1}, \hat{\beta}_0 is the estimated coefficient on the lagged level \tilde{y}_{t-1} in that regression, \tilde{y}_t is the detrended series, and k_{max} is the maximum lag considered. All candidate regressions are estimated over the common sample t = k_{max} + 1, \ldots, T.

Unlike the fixed AIC penalty 2(k+1)/T, the data-dependent term \tau_T(k) inflates the penalty when the sum of the autoregressive coefficients is close to unity. This gives unit root tests based on MAIC better size properties than standard AIC, particularly when the errors contain a large negative moving-average root.
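The selection procedure can be sketched in a few lines of base R. This is an illustrative implementation of the criterion under the "constant" detrending option; the function name and internal details are my assumptions, not the package's actual code:

```r
# Illustrative sketch of MAIC lag selection (Ng & Perron, 2001).
# NOTE: names and details are assumptions for exposition, not the
# internals of select_lag_maic().
maic_sketch <- function(y, maxlag = NULL) {
  n <- length(y)
  if (is.null(maxlag)) maxlag <- floor(12 * (n / 100)^0.25)
  yd  <- y - mean(y)              # detrend = "constant": demean the series
  dy  <- diff(yd)                 # first differences
  idx <- (maxlag + 1):(n - 1)     # common estimation sample for all k
  all_maic <- numeric(maxlag + 1)
  for (k in 0:maxlag) {
    ylag <- yd[idx]               # lagged level y_{t-1}
    X <- matrix(ylag, ncol = 1)
    if (k > 0)
      for (j in 1:k) X <- cbind(X, dy[idx - j])  # lagged differences
    fit <- lm.fit(x = X, y = dy[idx])
    s2  <- sum(fit$residuals^2) / length(idx)    # residual variance
    b0  <- fit$coefficients[1]                   # coefficient on y_{t-1}
    tau <- b0^2 * sum(ylag^2) / s2               # data-dependent penalty
    all_maic[k + 1] <- log(s2) + 2 * (tau + k) / length(idx)
  }
  list(selected_lag = which.min(all_maic) - 1, all_maic = all_maic)
}
```

Each candidate lag is evaluated on the same trimmed sample so the MAIC values are comparable across k, and the lag minimizing the criterion is returned.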

Value

A list with class "lag_selection" containing:

selected_lag

Optimal lag selected by MAIC

maic

MAIC value at optimal lag

all_maic

Vector of MAIC values for all lags

maxlag

Maximum lag considered

References

Ng, S., & Perron, P. (2001). Lag length selection and the construction of unit root tests with good size and power. Econometrica, 69(6), 1519-1554. doi:10.1111/1468-0262.00256

Examples

# Generate random walk
set.seed(123)
y <- cumsum(rnorm(200))

# Select lag
lag_sel <- select_lag_maic(y)
print(lag_sel)
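Continuing the example above, the components listed under Value can be inspected directly:

```r
# Access the components documented in the Value section
lag_sel$selected_lag  # optimal lag chosen by MAIC
lag_sel$maic          # MAIC value at the optimal lag
lag_sel$all_maic      # MAIC values for all candidate lags
lag_sel$maxlag        # maximum lag considered
```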


boundedur documentation built on March 16, 2026, 5:08 p.m.