constructModel: Construct an object of class BigVAR

View source: R/BigVARObjectClass.R


Description

Construct an object of class BigVAR

Usage

constructModel(
  Y,
  p,
  struct,
  gran,
  h = 1,
  cv = "Rolling",
  verbose = TRUE,
  IC = TRUE,
  VARX = list(),
  T1 = floor(nrow(Y)/3),
  T2 = floor(2 * nrow(Y)/3),
  ONESE = FALSE,
  ownlambdas = FALSE,
  recursive = FALSE,
  dates = as.character(NULL),
  window.size = 0,
  separate_lambdas = FALSE,
  linear = FALSE,
  loss = "L2",
  rolling_oos = FALSE,
  model.controls = list()
)

Arguments

Y

T \times k multivariate time series or, in the VARX case, a T \times (k+m) matrix containing the endogenous and exogenous series, respectively.

p

Predetermined maximal lag order (for modeled series).

struct

The choice of penalty structure (see details).

gran

vector of penalty parameter specifications.

h

Desired forecast horizon.

cv

Cross-validation approach, either 'Rolling' for rolling cross-validation or 'LOO' for leave-one-out cross-validation. 'None' for use with BigVAR.fit.

verbose

Verbose output while estimating.

IC

True or False: whether to include AIC and BIC benchmarks.

VARX

List containing VARX model specifications.

T1

Index of the time series at which to start cross-validation.

T2

Index of the time series at which to start forecast evaluation.

ONESE

True or False: whether to use the 'One Standard Error Heuristic.'

ownlambdas

True or False: Indicator for user-supplied penalty parameters.

recursive

True or False: Indicator as to whether iterative multi-step predictions are desired in the VAR context if the forecast horizon is greater than 1.

dates

optional vector of dates corresponding to Y.

window.size

size of rolling window. If set to 0, an expanding window will be used.

separate_lambdas

indicator for separate penalty parameters for each time series (default FALSE).

linear

indicator for linearly decrementing penalty grid (FALSE is log-linear; default FALSE).

loss

Loss function to select penalty parameter (one of 'L1','L2','Huber')

rolling_oos

True or False: indicator to update the penalty parameter over the evaluation period (default FALSE).

model.controls

named list of control parameters for BigVAR model estimation (see details).
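A minimal call needs only Y, p, struct, and gran; the remaining arguments take the defaults shown in Usage. A sketch, assuming the BigVAR package and its bundled Y dataset are available:

```r
library(BigVAR)
data(Y)  # example multivariate time series shipped with BigVAR

# Minimal specification: Basic VAR-L with maximal lag order 4 and a
# penalty grid of depth 50 containing 10 candidate parameters
mod <- constructModel(Y, p = 4, struct = "Basic", gran = c(50, 10))
```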

Details

The choices for 'struct' are as follows

  • 'Basic' (Basic VARX-L)

  • 'BasicEN' (Elastic Net VARX-L)

  • 'Lag' (Lag Group VARX-L)

  • 'SparseLag' (Lag Sparse Group VARX-L)

  • 'OwnOther' (Own/Other Group VARX-L)

  • 'SparseOO' (Own/Other Sparse Group VARX-L)

  • 'EFX' (Endogenous First VARX-L)

  • 'HLAGC' (Componentwise HLAG)

  • 'HLAGOO' (Own/Other HLAG)

  • 'HLAGELEM' (Elementwise HLAG)

  • 'Tapered' (Lag weighted Lasso VAR)

  • 'BGR' (Bayesian Ridge Regression (cf. Banbura et al))

  • 'MCP' (Minimax Concave Penalty (cf. Breheny and Huang))

  • 'SCAD' (Smoothly Clipped Absolute Deviation Penalty (cf. Breheny and Huang))

The first number in the vector 'gran' specifies how deep to construct the penalty grid and the second specifies how many penalty parameters to use. If ownlambdas is set to TRUE, gran should instead contain the user-supplied penalty parameters.
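The two gran conventions can be sketched as follows (the lambda values in the second call are arbitrary illustrations, not recommendations):

```r
library(BigVAR)
data(Y)

# Default convention: c(depth, count) - a grid of 10 penalty
# parameters, with depth 50 relative to the largest lambda
m1 <- constructModel(Y, p = 4, struct = "Basic", gran = c(50, 10))

# ownlambdas = TRUE: gran is interpreted as the penalty grid itself
m2 <- constructModel(Y, p = 4, struct = "Basic",
                     gran = c(1, 0.5, 0.1), ownlambdas = TRUE)
```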

VARX specifications consist of a named list with entry k, denoting the number of endogenous series to be modeled, and entry s, denoting the maximal lag order for the exogenous series.

As the capabilities of BigVAR have expanded, we have decided to consolidate parameters in the list model.controls. These parameters include:

  • 'alpha:' grid of candidate values for alpha in the Basic Elastic Net, Sparse Lag, and Sparse Own/Other VARX-L.

  • 'C:' vector of coefficients to shrink toward a random walk (if MN is TRUE).

  • 'delta:' parameter for Huber loss (default 2.5)

  • 'intercept:' option to fit an intercept, default TRUE

  • 'loss:' Loss function to select penalty parameter (one of 'L1','L2','Huber')

  • 'MN:' Minnesota Prior Indicator, default FALSE

  • 'RVAR:' option to refit based upon the support selected using the Relaxed-VAR procedure (default FALSE).

  • 'refit_fraction:' If RVAR is TRUE, proportional tradeoff between least squares fit and penalized fit (default 1).

  • 'tol:' optimization tolerance (default 1e-4)

The argument alpha is ignored unless the structure choice is 'SparseLag' or 'Lag'. By default 'alpha' is set to NULL and will be initialized as 1/(k+1) in cv.BigVAR and BigVAR.est. Any user-supplied values must be between 0 and 1.
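Putting the above together, model.controls might be populated as in this sketch (the specific values are illustrative only):

```r
library(BigVAR)
data(Y)

controls <- list(
  alpha = seq(0.1, 0.9, by = 0.2),  # candidate elastic-net mixing values
  intercept = TRUE,                 # fit an intercept (the default)
  MN = FALSE,                       # no Minnesota prior
  tol = 1e-4                        # optimization tolerance (the default)
)

mod <- constructModel(Y, p = 4, struct = "BasicEN", gran = c(50, 10),
                      model.controls = controls)
```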

Note

The specifications 'Basic', 'BasicEN', 'Lag', 'SparseLag', 'SparseOO', 'OwnOther', 'MCP', and 'SCAD' can accommodate both VAR and VARX models. 'EFX' applies only to VARX models. 'HLAGC', 'HLAGOO', 'HLAGELEM', and 'Tapered' can only be used with VAR models. Our implementation of the SCAD and MCP penalties is heavily influenced by the package ncvreg.

References

Banbura, Marta, Domenico Giannone, and Lucrezia Reichlin. 'Large Bayesian vector auto regressions.' Journal of Applied Econometrics 25.1 (2010): 71-92.

Breheny, Patrick, and Jian Huang. 'Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection.' Annals of Applied Statistics 5.1 (2011): 232-253.

Nicholson, William B., Ines Wilms, Jacob Bien, and David S. Matteson. 'High dimensional forecasting via interpretable vector autoregression.' Journal of Machine Learning Research 21.166 (2020): 1-52.

Nicholson, William B., David S. Matteson, and Jacob Bien. 'VARX-L: Structured regularization for large vector autoregressions with exogenous variables.' International Journal of Forecasting 33.3 (2017): 627-651.

Nicholson, William B., David S. Matteson, and Jacob Bien (2016). 'BigVAR: Tools for Modeling Sparse High-Dimensional Multivariate Time Series.' arXiv:1702.07094.

See Also

cv.BigVAR,BigVAR.est

Examples

# VARX Example
# Create a Basic VARX-L with k=2, m=1, s=2, p=4
VARX=list()
VARX$k=2 # indicates that the first two series are modeled
VARX$s=2 # sets 2 as the maximal lag order for exogenous series
data(Y)
T1=floor(nrow(Y)/3)
T2=floor(2*nrow(Y)/3)
Model1=constructModel(Y,p=4,struct='Basic',gran=c(50,10),verbose=FALSE,VARX=VARX,T1=T1,T2=T2)
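Once constructed, the object is typically passed to cv.BigVAR (see See Also) to select the penalty parameter and evaluate forecasts; a sketch continuing from Model1 above:

```r
# Rolling cross-validation over [T1, T2), then out-of-sample
# evaluation over [T2, T]; can take a while on larger grids
results <- cv.BigVAR(Model1)
results  # summary of the selected penalty and forecast performance
```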

BigVAR documentation built on Jan. 9, 2023, 5:08 p.m.