Description

Exact MLE for full AR models as well as subset AR models. Both subset ARp and subset ARz models are implemented. For subset ARp models the R function arima is used; for full AR and subset ARz models, the algorithm of McLeod and Zhang (2006) is implemented. A least-squares algorithm for subset ARp is also available as an option.
Usage

FitAR(z, p, lag.max = "default", ARModel = "ARz", ...)
Arguments

z: time series, vector or ts object.

p: specifies the model. If length(p) is 1, a full AR(p) is fitted; if p has length greater than 1, a subset ARp or ARz model is fitted (the default is ARz). For example, to fit a subset model with lags 1 and 4 present, set p to c(1,4) or, equivalently, c(1,0,0,4). To fit a subset model with just lag 4, you must use p = c(0,0,0,4), since p = 4 fits a full AR(4).

lag.max: the residual autocorrelations are tabulated for lags 1, ..., lag.max; lag.max is also used for the Ljung-Box portmanteau test.

ARModel: which subset model to fit, "ARz" or "ARp".

...: optional arguments which are passed to FitARp or FitARz.
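The conventions for the p argument can be illustrated with the lynx series. This is a sketch only; it assumes the FitAR package is installed and attached:

```r
library(FitAR)
z <- log(lynx)
# a single scalar p fits a full AR(p) with all lags 1..p
FitAR(z, p = 4)
# subset model with lags 1 and 4 only; the two forms are equivalent
FitAR(z, p = c(1, 4))
FitAR(z, p = c(1, 0, 0, 4))
# a subset model with lag 4 alone requires the zero-padded form
FitAR(z, p = c(0, 0, 0, 4))
```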
Details

The exact MLE for AR(p) and subset ARz models uses the methods described in McLeod and Zhang (2006). In addition, the exact MLE for the mean can be computed using an iterative backfitting approach described in McLeod and Zhang (2008a).

The subset ARp model can be fitted by exact MLE using the R function arima, or by least squares.

The default for lag.max is min(300, ceiling(length(z)/5)).
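For instance, the default lag.max works out as follows for the lynx series used in the examples below (base R only):

```r
z <- log(lynx)
length(z)                          # 114
min(300, ceiling(length(z) / 5))   # ceiling(114/5) = 23
```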
Value

A list with class name "FitAR" and components:

loglikelihood: value of the loglikelihood.

phiHat: coefficients in AR(p), including 0's.

sigsqHat: innovation variance estimate.

muHat: estimate of the mean.

covHat: covariance matrix of the coefficient estimates.

zetaHat: transformed parameters; length(zetaHat) equals the number of coefficients estimated.

RacfMatrix: residual autocorrelations and their sd for lags 1, ..., lag.max.

LjungBox: table of Ljung-Box portmanteau test statistics.

SubsetQ: TRUE if a subset model was fitted, otherwise FALSE.

res: innovation residuals, same length as z.

fits: fitted values, same length as z.

pvec: lags used in the AR model.

demean: TRUE if the mean was estimated, otherwise the mean is assumed to be zero.

FitMethod: "MLE" or "LS".

IterationCount: number of iterations in the MLE estimation of the mean.

convergence: value returned by optim; should be 0.

MLEMeanQ: TRUE if the MLE-for-mean algorithm was used.

ARModel: "ARp" if FitARp was used, otherwise "ARz".

tsp: tsp(z).

call: result from match.call() showing how the function was called.

ModelTitle: description of the model.

DataTitle: returns attr(z, "title").

z: time series data input.
Note

There are generic print, summary, coef and resid methods for class "FitAR".

It is somewhat surprising that in the 'ARp' subset autoregression quite different subsets may be chosen depending on the choice of lag.max. For example, with the lynx data, taking lag.max = 15 and lag.max = 20 produces the subsets 1, 2, 4, 10, 11 and 1, 2, 10, 11, respectively, using the BIC. This also occurs with the AIC. See the sixth example below.
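A sketch of how the generics and the components listed under Value above might be used on a fitted model (assumes FitAR is installed and attached):

```r
library(FitAR)
fit <- FitAR(log(lynx), c(1, 2, 4, 7, 10, 11))
summary(fit)        # model summary
coef(fit)           # coefficient table with sd and Z-ratio
head(resid(fit))    # innovation residuals
fit$LjungBox        # Ljung-Box portmanteau table
fit$pvec            # lags used in the fitted model
```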
Author(s)

A.I. McLeod
References

McLeod, A.I. and Zhang, Y. (2006). Partial Autocorrelation Parameterization for Subset Autoregression. Journal of Time Series Analysis, 27, 599-612.

McLeod, A.I. and Zhang, Y. (2008a). Faster ARMA Maximum Likelihood Estimation. Computational Statistics and Data Analysis, 52(4), 2166-2176. DOI: http://dx.doi.org/10.1016/j.csda.2007.07.020.

McLeod, A.I. and Zhang, Y. (2008b, Submitted). Improved Subset Autoregression: With R Package. Journal of Statistical Software.
See Also

FitARp, FitARz, GetFitARz, GetFitARpMLE, RacfPlot
Examples

#First example: fit exact MLE to AR(4)
set.seed(3323)
phi<-c(2.7607,-3.8106,2.6535,-0.9238)
z<-SimulateGaussianAR(phi,1000)
ans<-FitAR(z,4,MeanMLEQ=TRUE)
ans
coef(ans)
## Not run: #save time building package!
#Second example: compare with sample mean result
ans<-FitAR(z,4)
coef(ans)
#Third example: fit subset ARz and ARp models
z<-log(lynx)
FitAR(z, c(1,2,4,7,10,11))
#now obtain exact MLE for Mean as well
FitAR(z, c(1,2,4,7,10,11), MeanMLEQ=TRUE)
#subset ARp using exact MLE
FitAR(z, c(1,2,4,7,10,11), ARModel="ARp", MLEQ=TRUE)
#subset ARp using LS
FitAR(z, c(1,2,4,7,10,11), ARModel="ARp", MLEQ=FALSE)
#or
FitAR(z, c(1,2,4,7,10,11), ARModel="ARp")
#Fourth example: use UBIC model selection to fit subset models
z<-log(lynx)
#ARz case
p<-SelectModel(z,ARModel="ARz")[[1]]$p
ans1<-FitAR(z, p)
ans1
ans1$ARModel
#ARp case
p<-SelectModel(z,ARModel="ARp")[[1]]$p
ans2<-FitAR(z, p, ARModel="ARp")
ans2
ans2$ARModel
#Fifth example: fit a full AR(p) using AIC/BIC methods
z<-log(lynx)
#BIC
p<-SelectModel(z,ARModel="AR")[1,1]
ans1<-FitAR(z, p)
ans1
#AIC
p<-SelectModel(z, ARModel="AR", Criterion="AIC")[1,1]
ans2<-FitAR(z, p)
ans2
## End(Not run)
#Sixth Example: Subset autoregression depends on lag.max!
#Because least-squares is used, P = lag.max observations
# are deleted. This causes different results depending on lag.max.
#This phenomenon does not happen with "ARz" subset models
#ARp models depend on lag.max
SelectModel(z,lag.max=15,ARModel="ARp", Criterion="BIC", Best=1)
SelectModel(z,lag.max=20,ARModel="ARp", Criterion="BIC", Best=1)
#ARz models do NOT depend in this way on lag.max.
#Obviously if some lags beyond the initial value of lag.max are
# found to be important, then there is a dependence but this
# is not a problem!
SelectModel(z,lag.max=15,ARModel="ARz", Criterion="BIC", Best=1)
SelectModel(z,lag.max=20,ARModel="ARz", Criterion="BIC", Best=1)
Example output
AR(4). MLE. Mean estimated using MLE
length of series = 1000 , number of parameters = 5
loglikelihood = -11.976 , AIC = 34 , BIC = 58.5
MLE sd Z-ratio
phi(1) 2.77553630 0.01204809 230.3715443
phi(2) -3.83157614 0.02692659 -142.2971263
phi(3) 2.66902922 0.02692659 99.1224429
phi(4) -0.92457753 0.01204809 -76.7406119
mu 0.05835087 0.10139560 0.5754774
MLE sd Z-ratio
phi(1) 2.7755460 0.01205046 230.327005
phi(2) -3.8315616 0.02693187 -142.268656
phi(3) 2.6689967 0.02693187 99.101781
phi(4) -0.9245466 0.01205046 -76.722941
mu 0.1027028 0.10140289 1.012819
AR(11). MLE. Mean estimated using the sample mean
length of series = 114 , number of parameters = 7
loglikelihood = 88.49 , AIC = -163 , BIC = -143.8 , UBIC = -132.2
AR(11). MLE. Mean estimated using MLE
length of series = 114 , number of parameters = 7
loglikelihood = 88.556 , AIC = -163.1 , BIC = -144 , UBIC = -132.4
AR(11). MLE.
length of series = 114 , number of parameters = 6
loglikelihood = 89.164 , AIC = -166.3 , BIC = -149.9 , UBIC = -137.6
AR(11). LS Fit.
length of series = 114 , number of parameters = 6
loglikelihood = 89.001 , AIC = -166 , BIC = -149.6 , UBIC = -137.3
AR(11). LS Fit.
length of series = 114 , number of parameters = 6
loglikelihood = 89.001 , AIC = -166 , BIC = -149.6 , UBIC = -137.3
AR(11). MLE. Mean estimated using the sample mean
length of series = 114 , number of parameters = 6
loglikelihood = 86.45 , AIC = -160.9 , BIC = -144.5 , UBIC = -132.2
[1] "ARz"
AR(11). LS Fit.
length of series = 114 , number of parameters = 5
loglikelihood = 89.006 , AIC = -168 , BIC = -154.3 , UBIC = -142.1
[1] "ARp"
AR(2). MLE. Mean estimated using the sample mean
length of series = 114 , number of parameters = 3
loglikelihood = 73.184 , AIC = -140.4 , BIC = -132.2
AR(11). MLE. Mean estimated using the sample mean
length of series = 114 , number of parameters = 12
loglikelihood = 91.678 , AIC = -159.4 , BIC = -126.5
[1] 1 2 4 10 11
[1] 1 2 5 9 10 11
[1] 1 2 7 10 11
[1] 1 2 7 10 11