var_lm (R Documentation)
This function fits VAR(p) using the OLS method.
var_lm(y, p = 1, include_mean = TRUE, method = c("nor", "chol", "qr"))
## S3 method for class 'varlse'
print(x, digits = max(3L, getOption("digits") - 3L), ...)
## S3 method for class 'varlse'
logLik(object, ...)
## S3 method for class 'varlse'
AIC(object, ...)
## S3 method for class 'varlse'
BIC(object, ...)
is.varlse(x)
is.bvharmod(x)
## S3 method for class 'varlse'
knit_print(x, ...)
y | Time series data of which columns indicate the variables
p | Lag of VAR (Default: 1)
include_mean | Add constant term (Default: TRUE)
method | Method to solve the linear equation system: "nor" (default), "chol", or "qr"
x | Any object
digits | Number of digits to print (Default: max(3L, getOption("digits") - 3L))
... | Not used
object | A varlse object
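A brief illustration of these arguments together with the S3 methods listed in the Usage section (a sketch, assuming the package is attached and using the etf_vix dataset from the Examples below):
# Fit a VAR(2) without a constant term, using the QR solver
fit_qr <- var_lm(y = etf_vix, p = 2, include_mean = FALSE, method = "qr")
print(fit_qr, digits = 3)
logLik(fit_qr)
AIC(fit_qr)
BIC(fit_qr)
is.varlse(fit_qr)   # TRUE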
This package specifies the VAR(p) model as
Y_{t} = A_1 Y_{t - 1} + \cdots + A_p Y_{t - p} + c + \epsilon_t
If include_mean = TRUE, the constant term c is included.
The function estimates every coefficient matrix.
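For intuition only (not part of the package API), the recursion above can be simulated directly in base R; the coefficient values below are arbitrary assumptions for a bivariate VAR(1) with a constant term:
# Simulate y_t = A1 y_{t-1} + c + e_t for a bivariate series
set.seed(1)
A1 <- matrix(c(0.5, 0.1, 0.2, 0.4), nrow = 2)  # assumed coefficient matrix
c0 <- c(0.3, -0.1)                             # assumed constant term
n_sim <- 200
y_sim <- matrix(0, nrow = n_sim, ncol = 2)
for (t in 2:n_sim) {
  y_sim[t, ] <- A1 %*% y_sim[t - 1, ] + c0 + rnorm(2, sd = 0.5)
}
head(y_sim)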
Consider the response matrix Y_0. Let T be the total sample size, let m be the dimension of the time series, let p be the order of the model, and let n = T - p.
The likelihood of VAR(p) is given by
Y_0 \mid B, \Sigma_e \sim MN(X_0 B, I_n, \Sigma_e)
where X_0 is the design matrix and MN denotes the matrix normal distribution.
Then the log-likelihood of the vector autoregressive model family is given by
\log p(Y_0 \mid B, \Sigma_e) = - \frac{nm}{2} \log 2\pi - \frac{n}{2} \log \det \Sigma_e - \frac{1}{2} tr( (Y_0 - X_0 B) \Sigma_e^{-1} (Y_0 - X_0 B)^T )
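As a sanity check, the log-likelihood can be evaluated from the fitted residuals; at the ML estimates the trace term reduces to nm/2. This is a sketch assuming etf_vix is available, and the value returned by logLik() may differ slightly if the package evaluates the likelihood at a different covariance estimate.
fit <- var_lm(y = etf_vix, p = 2)
resid_mat <- residuals(fit)
n <- nrow(resid_mat)
m <- ncol(resid_mat)
sigma_tilde <- crossprod(resid_mat) / n        # ML estimate of Sigma_e
ll_manual <- -n * m / 2 * log(2 * pi) -
  n / 2 * as.numeric(determinant(sigma_tilde, logarithm = TRUE)$modulus) -
  n * m / 2                                    # trace term equals n * m at the MLE
ll_manual
logLik(fit)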
In addition, recall that the OLS estimator of the coefficient matrix coincides with the MLE under the Gaussian assumption, while the MLE of \Sigma_e has a different denominator, n:
\hat{B} = \hat{B}^{LS} = \hat{B}^{ML} = (X_0^T X_0)^{-1} X_0^T Y_0
\hat\Sigma_e = \frac{1}{n - k} (Y_0 - X_0 \hat{B})^T (Y_0 - X_0 \hat{B})
\tilde\Sigma_e = \frac{1}{n} (Y_0 - X_0 \hat{B})^T (Y_0 - X_0 \hat{B}) = \frac{n - k}{n} \hat\Sigma_e
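A minimal hand-rolled sketch of these closed-form estimators, built with base R's embed(); the constant column is appended last here, which may differ from the package's internal ordering of X_0:
p <- 2
y_mat <- as.matrix(etf_vix)
m <- ncol(y_mat)
Z <- embed(y_mat, p + 1)                       # row t: (y_t, y_{t-1}, ..., y_{t-p})
Y0 <- Z[, 1:m]                                 # response matrix
X0 <- cbind(Z[, -(1:m)], 1)                    # lagged regressors plus constant
B_hat <- solve(crossprod(X0), crossprod(X0, Y0))   # (X0'X0)^{-1} X0'Y0
n <- nrow(Y0)
k <- nrow(B_hat)                               # k = mp + 1 with a constant term
sigma_hat <- crossprod(Y0 - X0 %*% B_hat) / (n - k)   # unbiased estimator
sigma_tilde <- crossprod(Y0 - X0 %*% B_hat) / n       # MLE
# Residuals do not depend on the column ordering of X0, so sigma_hat should
# match the covmat component if the same n - k denominator is used
fit <- var_lm(y = etf_vix, p = 2)
max(abs(sigma_hat - fit$covmat))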
Let \tilde{\Sigma}_e be the MLE and let \hat{\Sigma}_e be the unbiased estimator (covmat) for \Sigma_e. Note that
\tilde{\Sigma}_e = \frac{n - k}{n} \hat{\Sigma}_e
Then
AIC(p) = \log \det \tilde{\Sigma}_e + \frac{2}{n}(\text{number of freely estimated parameters})
where the number of freely estimated parameters is mk, i.e. pm^2 or pm^2 + m.
Similarly,
BIC(p) = \log \det \tilde{\Sigma}_e + \frac{\log n}{n}(\text{number of freely estimated parameters})
where the number of freely estimated parameters is pm^2 (or pm^2 + m with a constant term).
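Both criteria can be computed by hand from the formulas above (a sketch; the values from the package's AIC() and BIC() methods may be on a different scale if they follow the usual -2 log-likelihood convention instead):
fit <- var_lm(y = etf_vix, p = 2)
resid_mat <- residuals(fit)
n <- nrow(resid_mat)
m <- ncol(resid_mat)
k <- m * 2 + 1                                 # mp + 1 coefficients per equation (p = 2, constant included)
sigma_tilde <- crossprod(resid_mat) / n        # MLE of Sigma_e
log_det <- as.numeric(determinant(sigma_tilde, logarithm = TRUE)$modulus)
n_par <- m * k                                 # number of freely estimated parameters
c(aic = log_det + 2 / n * n_par,
  bic = log_det + log(n) / n * n_par)
AIC(fit)
BIC(fit)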
var_lm() returns an object of class varlse.
It is a list with the following components:
Coefficient Matrix
Fitted response values
Residuals
LS estimate for covariance matrix
Number of Coefficients
Lag of VAR
Dimension of the data
Sample size used when training = totobs - p
Total number of observations
Process: VAR
Include constant term (const) or not (none)
Design matrix
Raw input
Multivariate response matrix
Solving method
Matched call
It is also of class bvharmod.
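A short sketch using the class predicates from the Usage section and the covmat component described above (assuming etf_vix is available):
fit <- var_lm(y = etf_vix, p = 2)
is.varlse(fit)      # TRUE
is.bvharmod(fit)    # TRUE
fit$covmat          # LS (unbiased) estimate of the covariance matrix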
Lütkepohl, H. (2007). New Introduction to Multiple Time Series Analysis. Springer Publishing.
Akaike, H. (1969). Fitting autoregressive models for prediction. Ann Inst Stat Math 21, 243-247.
Akaike, H. (1971). Autoregressive model fitting for control. Ann Inst Stat Math 23, 163-180.
Akaike, H. (1974). A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19(6), 716-723.
Akaike, H. (1998). Information Theory and an Extension of the Maximum Likelihood Principle. In: Parzen, E., Tanabe, K., Kitagawa, G. (eds) Selected Papers of Hirotugu Akaike. Springer Series in Statistics (Perspectives in Statistics). Springer, New York, NY.
Schwarz, G. (1978). Estimating the Dimension of a Model. Ann. Statist. 6(2), 461-464.
summary.varlse() to summarize the fitted VAR model
# Perform the function using etf_vix dataset
fit <- var_lm(y = etf_vix, p = 2)
class(fit)
str(fit)
# Extract coef, fitted values, and residuals
coef(fit)
head(residuals(fit))
head(fitted(fit))