```r
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  fig.align = "center",
  fig.height = 4,
  fig.width = 6,
  fig.path = "Figs/",
  fig.show = "hold",
  warning = FALSE,
  message = FALSE
)
```
This vignette briefly explains how to use the `adam()` function and the related `auto.adam()` in the `smooth` package. It does not aim to cover all aspects of the function, but focuses on the main ones.
ADAM is the Advanced Dynamic Adaptive Model. It is a model that underlies ETS, ARIMA and regression, connecting them in a unified framework. The underlying model for ADAM is a Single Source of Error state space model, which will be explained in detail separately in a bookdown file (under construction at the moment).
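In its most general form, the Single Source of Error model can be sketched as follows (a simplified outline in the spirit of [@Hyndman2008b], not the full ADAM specification):

$$y_t = w(\mathbf{v}_{t-\boldsymbol{l}}) + r(\mathbf{v}_{t-\boldsymbol{l}}) \epsilon_t ,$$
$$\mathbf{v}_t = f(\mathbf{v}_{t-\boldsymbol{l}}) + g(\mathbf{v}_{t-\boldsymbol{l}}) \epsilon_t ,$$

where $\mathbf{v}_t$ is the state vector, $\boldsymbol{l}$ is the vector of lags, $w(\cdot)$ is the measurement function, $r(\cdot)$ is the error function, $f(\cdot)$ is the transition function, $g(\cdot)$ is the persistence function and $\epsilon_t$ is the error term.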
The main philosophy of the `adam()` function is to be agnostic of the provided data. This means that it will work with `ts`, `msts`, `zoo`, `xts`, `data.frame`, `numeric` and other classes of data. The specification of seasonality in the model is done using a separate parameter `lags`, so you are not obliged to transform the existing data into something specific, and can use it as is. If you provide a `matrix`, a `data.frame`, a `data.table`, or any other multivariate structure, then the function will use the first column for the response variable and the others for the explanatory ones. One thing that is currently assumed in the function is that the data is measured at a regular frequency. If this is not the case, you will need to introduce missing values manually.
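Here is a minimal base R sketch of such a manual fill, assuming a daily series stored in a hypothetical data frame `irregularData` with a `date` column (both names are made up for illustration):

```r
# Hypothetical daily data with gaps: some days are missing
irregularData <- data.frame(date=as.Date("2020-01-01")+c(0,1,3,6),
                            y=c(10,12,9,11))
# Build the regular daily grid and merge, producing NAs for the missing days
regularGrid <- data.frame(date=seq(min(irregularData$date),
                                   max(irregularData$date), by="day"))
regularData <- merge(regularGrid, irregularData, by="date", all.x=TRUE)
regularData
```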
In order to run the experiments in this vignette, we need to load the following packages:
```r
require(greybox)
require(smooth)
require(forecast)
require(Mcomp)
```
First and foremost, ADAM implements the ETS model, although in a more flexible way than [@Hyndman2008b]: it supports different distributions for the error term, which are regulated via the `distribution` parameter. By default, the additive error model relies on the Normal distribution, while the multiplicative error one assumes the Inverse Gaussian. If you want to reproduce the classical ETS, you need to specify `distribution="dnorm"`. Here is an example of ADAM ETS(MMM) with the Normal distribution on the N2568 data from the M3 competition (if you provide an `Mcomp` object, `adam()` will automatically set the train and test sets, the forecast horizon and even the needed lags):
```r
testModel <- adam(M3[[2568]], "MMM", lags=c(1,12), distribution="dnorm")
summary(testModel)
plot(forecast(testModel,h=18,interval="parametric"))
```
You might notice that the summary contains more than what is reported by other `smooth` functions. It also produces standard errors for the estimated parameters, based on the Fisher Information calculation. Note that this is computationally expensive, so if you have a model with more than 30 variables, the calculation of standard errors might take a long time. As for the default `print()` method, it produces a shorter summary of the model, without the standard errors (similar to what `es()` does):
```r
testModel
```
Also note that the prediction intervals in the case of multiplicative error models are approximate. It is advisable to use simulations instead (which is slower, but more accurate):
```r
plot(forecast(testModel,h=18,interval="simulated"))
```
If you want to do residual diagnostics, then it is recommended to use the `plot()` method, something like this (you can select which of the plots to produce):
```r
par(mfcol=c(3,4))
plot(testModel,which=c(1:11))
par(mfcol=c(1,1))
plot(testModel,which=12)
```
By default, ADAM estimates models via maximising the likelihood function. But there is also a `loss` parameter, which allows selecting from a list of already implemented loss functions (again, see the documentation for `adam()` for the full list) or using a function written by the user. Here is how to do the latter on the example of another M3 series:
```r
lossFunction <- function(actual, fitted, B){
    return(sum(abs(actual-fitted)^3))
}
testModel <- adam(M3[[1234]], "AAN", silent=FALSE, loss=lossFunction)
testModel
```
Note that the function needs to have the parameters `actual`, `fitted` and `B`, which correspond to the vector of actual values, the vector of fitted values on each iteration and the vector of optimised parameters.
The `loss` and `distribution` parameters are independent, so in the example above we have assumed that the error term follows the Normal distribution, but we have estimated its parameters using a non-conventional loss, because we can.
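The combination works the other way around as well; here is a small sketch, assuming that `"dinvgauss"` and `"MAE"` are among the implemented options (see the documentation of `adam()` for the current lists):

```r
# A sketch: Inverse Gaussian error term, parameters estimated via MAE
testModel <- adam(M3[[1234]], "MNN", silent=TRUE,
                  distribution="dinvgauss", loss="MAE")
testModel
```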
The model selection in ADAM ETS relies on information criteria and works correctly only for `loss="likelihood"`. There are several options for how to select the model; see them in the description of the function: `?adam`. The default one uses a branch-and-bound algorithm, similar to the one used in `es()`, but it only considers additive trend models (the multiplicative trend ones are less stable and need more attention from a forecaster):
```r
testModel <- adam(M3[[2568]], "ZXZ", lags=c(1,12), silent=FALSE)
testModel
```
Note that the function produces point forecasts if `h>0`, but it won't generate prediction intervals. This is why you need to use the `forecast()` method (as shown in the previous example).
Similarly to `es()`, the function supports the combination of models, but it saves all the tested models in the output for potential reuse. Here is how it works:
```r
testModel <- adam(M3[[2568]], "CXC", lags=c(1,12))
testForecast <- forecast(testModel,h=18,interval="semiparametric", level=c(0.9,0.95))
testForecast
plot(testForecast)
```
Yes, we now support vectors for the levels, in case you want to produce several. In fact, we also support the `side` parameter for the prediction interval, so you can extract specific quantiles without a hassle:
```r
forecast(testModel,h=18,interval="semiparametric", level=c(0.9,0.95,0.99), side="upper")
```
A brand new thing in the function is the possibility to use several frequencies (double / triple / quadruple / ... seasonal models). Here is what we can have in the case of half-hourly data:
```r
testModel <- adam(forecast::taylor, "MMdM", lags=c(1,48,336), silent=FALSE,
                  h=336, holdout=TRUE)
testModel
```
Note that the more lags you have, the more initial seasonal components the function will need to estimate, which is a difficult task. The optimiser might not get close to the optimal value, so we can help it. First, we can give it more time for the calculation by increasing the number of iterations via `maxeval` (the default value is 20 iterations for each optimised parameter; so, in the case of the previous model, it is 389*20=7780):
```r
testModel <- adam(forecast::taylor, "MMdM", lags=c(1,48,336), silent=FALSE,
                  h=336, holdout=TRUE, maxeval=10000)
testModel
```
This will take more time, but will typically lead to more refined parameters. You can control other parameters of the optimiser as well, such as `algorithm`, `xtol_rel`, `print_level` and others, which are explained in the documentation for the `nloptr` function from the `nloptr` package (run `nloptr.print.options()` for details). Second, we can give the optimiser a different set of initial parameters; have a look at what the function saves:
```r
testModel$B
```
and use this as a starting point (e.g. with a different algorithm):
```r
testModel <- adam(forecast::taylor, "MMdM", lags=c(1,48,336), silent=FALSE,
                  h=336, holdout=TRUE, B=testModel$B)
testModel
```
Finally, we can speed up the process by using a different initialisation of the state vector, such as backcasting:
```r
testModel <- adam(forecast::taylor, "MMdM", lags=c(1,48,336), silent=FALSE,
                  h=336, holdout=TRUE, initial="b")
```
The result might be less accurate than in the case of optimisation, but it should be faster.
In addition, you can specify some parts of the initial state vector or some parts of the persistence vector; here is an example:
```r
testModel <- adam(forecast::taylor, "MMdM", lags=c(1,48,336), silent=TRUE,
                  h=336, holdout=TRUE, initial=list(level=30000, trend=1),
                  persistence=list(beta=0.1))
testModel
```
The function also handles intermittent data (data with zeroes) and data with missing values. This is partially covered in the vignette on the `oes()` function. Here is a simple example:
```r
testModel <- adam(rpois(120,0.5), "MNN", silent=FALSE, h=12, holdout=TRUE,
                  occurrence="odds-ratio")
testModel
```
Finally, `adam()` is faster than the `es()` function, because its code is more efficient and it uses a different optimisation algorithm with more finely tuned parameters by default. Let's compare:
```r
adamModel <- adam(M3[[2568]], "CCC", lags=c(1,12))
esModel <- es(M3[[2568]], "CCC")
"adam:"
adamModel
"es():"
esModel
```
As mentioned above, ADAM does not only contain ETS; it also contains the ARIMA model, which is regulated via the `orders` parameter. If you want to have a pure ARIMA, you need to switch ETS off, which is done via `model="NNN"`:
```r
testModel <- adam(M3[[1234]], "NNN", silent=FALSE, orders=c(0,2,2))
testModel
```
Given that both models are implemented in the same framework, they can be compared using information criteria.
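Here is a quick sketch of such a comparison, using `AICc()` from the `greybox` package (assuming that both models are estimated via `loss="likelihood"` on the same data):

```r
# Fit ETS and ARIMA in the same framework and compare information criteria
etsModel <- adam(M3[[1234]], "AAN", silent=TRUE)
arimaModel <- adam(M3[[1234]], "NNN", silent=TRUE, orders=c(0,2,2))
c(ETS=AICc(etsModel), ARIMA=AICc(arimaModel))
```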
The functionality of ADAM ARIMA is similar to that of the `msarima` function in the `smooth` package, although there are several differences.
First, changing the `distribution` parameter allows switching between additive / multiplicative models. For example, `distribution="dlnorm"` will create an ARIMA equivalent to the one applied to the logarithms of the data:
```r
testModel <- adam(M3[[2568]], "NNN", silent=FALSE, lags=c(1,12),
                  orders=list(ar=c(1,1),i=c(1,1),ma=c(2,2)), distribution="dlnorm")
testModel
```
Second, it does not have an intercept. If you want to have one, you can reintroduce the ETS component and impose some restrictions:
```r
testModel <- adam(M3[[2568]], "ANN", silent=FALSE, persistence=0, lags=c(1,12),
                  orders=list(ar=c(1,1),i=c(1,1),ma=c(2,2)), distribution="dnorm")
testModel
```
This way we get the global level, which acts as an intercept. The drift is not supported in the model either.
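If you need something drift-like, one possible workaround (a sketch, not an exact equivalent of the classical drift) is to fix both smoothing parameters of ETS(A,A,N) at zero, which produces a global level and trend on top of the ARMA error:

```r
# A sketch: global level and trend (drift-like) combined with ARMA(1,1) errors
testModel <- adam(M3[[1234]], "AAN", silent=TRUE, persistence=c(0,0),
                  orders=list(ar=1,i=0,ma=1), distribution="dnorm")
testModel
```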
Third, you can specify the parameters of ARIMA via the `arma` parameter in the following manner:
```r
testModel <- adam(M3[[2568]], "NNN", silent=FALSE, lags=c(1,12),
                  arma=list(ar=c(0.1,0.1), ma=c(-0.96, 0.03, -0.12, 0.03)),
                  orders=list(ar=c(1,1),i=c(1,1),ma=c(2,2)), distribution="dnorm")
testModel
```
Finally, the initials for the states can also be provided, although getting the correct ones might be a challenging task (you also need to know how many of them to provide; checking `testModel$initial` might help):
```r
testModel <- adam(M3[[2568]], "NNN", silent=FALSE, lags=c(1,12),
                  initial=list(arima=M3[[2568]]$x[1:24]),
                  orders=list(ar=c(1,1),i=c(1,1),ma=c(2,0)), distribution="dnorm")
testModel
```
If you work with the ADAM ARIMA model, then there is no such thing as the "usual" bounds for the parameters, so the function uses `bounds="admissible"`, checking the AR / MA polynomials in order to make sure that the model is stationary and invertible (aka stable).
Similarly to ETS, you can use different distributions and losses for the estimation. Note that the order selection for ARIMA is done in the `auto.adam()` function, not in `adam()`!
Finally, ARIMA is typically slower than ETS, mainly because `maxeval` is set to 400 by default in this case. This is inevitable due to the increased complexity of the model: otherwise it won't be estimated properly. If you want to speed things up, use `initial="backcasting"` and reduce the number of iterations, as shown in the sketch below.
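A sketch of this speed-up on the series from above (the specific `maxeval` value here is purely illustrative):

```r
# Backcasting initialisation with a reduced number of iterations
testModel <- adam(M3[[2568]], "NNN", silent=TRUE, lags=c(1,12),
                  orders=list(ar=c(1,1),i=c(1,1),ma=c(2,2)),
                  initial="backcasting", maxeval=1000)
testModel
```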
Another important feature of ADAM is the introduction of explanatory variables. This is done in a similar way to `es()`. Here is a brief example:
```r
testModel <- adam(BJsales, "AAN", xreg=BJsales.lead, h=18, silent=FALSE)
```
If you work with a data.frame or a similar structure, then you can use it directly: ADAM will extract the response variable either by assuming that it is in the first column or from the provided formula (if you specify one via the `formula` parameter). Here is an example, where we create a matrix with lags and leads of an explanatory variable:
```r
xreg <- cbind(as.data.frame(BJsales),
              as.data.frame(xregExpander(BJsales.lead,c(-7:7))))
colnames(xreg)[1] <- "y"
testModel <- adam(xreg, "ANN", h=18, silent=FALSE, holdout=TRUE,
                  formula=y~xLag1+xLag2+xLag3)
testModel
```
Similarly to `es()`, there is support for variable selection via the `xregDo` parameter, which will then use the `stepwise()` function from the `greybox` package on the residuals of the model:
```r
testModel <- adam(xreg, "ANN", h=18, silent=FALSE, holdout=TRUE, xregDo="select")
```
The same functionality is supported with ARIMA, so you can have, for example, ARIMAX(0,1,1), which is equivalent to ETSX(A,N,N):
```r
testModel <- adam(xreg, "NNN", h=18, silent=FALSE, holdout=TRUE,
                  xregDo="select", orders=c(0,1,1))
```
The two models might differ because they have different initialisations in the optimiser. It is possible to make them identical if the number of iterations is increased and the initial parameters are the same. Here is an example of what happens when the two models have exactly the same parameters:
```r
xreg <- xreg[,c("y",names(testModel$initial$xreg))]
testModel <- adam(xreg, "NNN", h=18, silent=TRUE, holdout=TRUE,
                  orders=c(0,1,1), initial=testModel$initial, arma=testModel$arma)
testModel
names(testModel$initial)[1] <- "level"
testModel2 <- adam(xreg, "ANN", h=18, silent=TRUE, holdout=TRUE,
                   initial=testModel$initial, persistence=testModel$arma$ma+1)
testModel2
```
Another feature of ADAM is time-varying parameters in the SSOE framework, which can be switched on via `xregDo="adapt"`:
```r
testModel <- adam(xreg, "ANN", h=18, silent=FALSE, holdout=TRUE, xregDo="adapt")
testModel$persistence
```
Note that the default number of iterations might not be sufficient to get close to the optimum of the function, so setting `maxeval` to something bigger might help. If you want to explore why the optimisation stopped, you can provide the `print_level=41` parameter to the function, and it will print out the report from the optimiser. In the end, the default parameters are tuned to give a reasonable solution, but given the complexity of the model, they are not guaranteed to give the best one all the time.
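Here is a sketch of such a diagnostic run (`print_level` is passed through to the optimiser as described above; the `maxeval` value is purely illustrative):

```r
# Ask the optimiser to print its report, including the reason for stopping
testModel <- adam(xreg, "ANN", h=18, silent=TRUE, holdout=TRUE,
                  xregDo="adapt", maxeval=5000, print_level=41)
```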
Finally, you can produce a mixture of ETS, ARIMA and regression by using the respective parameters, like this:
```r
testModel <- adam(xreg, "AAN", h=18, silent=FALSE, holdout=TRUE,
                  xregDo="use", orders=c(1,0,1))
summary(testModel)
```
This might be handy when you explore high frequency data, want to add calendar events, apply ETS and add AR/MA errors to it.
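For instance, here is a hypothetical sketch with half-hourly data and a manually constructed dummy variable for a calendar event (the event itself is made up for illustration):

```r
# Half-hourly data with an illustrative calendar dummy; ETS + AR(1) errors
y <- as.numeric(forecast::taylor)
eventDummy <- rep(0, length(y))
eventDummy[1:48] <- 1  # flag the first day as a made-up "event"
highFreqData <- cbind(y=y, event=eventDummy)
testModel <- adam(highFreqData, "MNM", lags=c(1,336), silent=TRUE,
                  orders=list(ar=1), h=336, holdout=TRUE)
```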
While the original `adam()` function allows selecting ETS components and explanatory variables, it does not allow selecting the most suitable distribution and / or ARIMA components. This is what the `auto.adam()` function is for.
In order to do the selection of the most appropriate distribution, you need to provide a vector of those that you want to check:
```r
testModel <- auto.adam(M3[[1234]], "XXX", silent=FALSE,
                       distribution=c("dnorm","dlogis","dlaplace","ds"))
testModel
```
This process can also be done in parallel, either on an automatically selected number of cores (e.g. `parallel=TRUE`) or on a number specified by the user (e.g. `parallel=4`):
```r
testModel <- auto.adam(M3[[1234]], "ZZZ", silent=FALSE, parallel=TRUE)
testModel
```
If you want to add ARIMA or regression components, you can do it in exactly the same way as for the `adam()` function. Here is an example of ETS+ARIMA:
```r
testModel <- auto.adam(M3[[1234]], "AAN", orders=list(ar=2,i=2,ma=2),
                       silent=TRUE, parallel=TRUE)
testModel
```
However, this way the function will just use ARIMA(2,2,2) and fit it together with ETS. If you want it to select the most appropriate ARIMA orders, you need to add the parameter `select=TRUE` to the list in `orders`:
```r
testModel <- auto.adam(M3[[1234]], "XXN", orders=list(ar=2,i=2,ma=2,select=TRUE),
                       silent=FALSE, parallel=TRUE)
```
Knowing how to work with `adam()`, you can use similar principles when dealing with `auto.adam()`. Just keep in mind that the provided `persistence`, `phi`, `initial`, `arma` and `B` won't work, because this contradicts the idea of model building.