tsmooth: Prediction and Interpolation of Time Series

View source: R/tsmooth.R

tsmooth    R Documentation

Prediction and Interpolation of Time Series

Description

Predicts and interpolates a time series based on a state space model, using the Kalman filter.

Usage

tsmooth(y, f, g, h, q, r, x0 = NULL, v0 = NULL, filter.end = NULL,
        predict.end = NULL, minmax = c(-1.0e+30, 1.0e+30), missed = NULL,
        np = NULL, plot = TRUE, ...)

Arguments

y

a univariate time series y_n.

f

state transition matrix F_n.

g

matrix G_n.

h

matrix H_n.

q

system noise variance Q_n.

r

observational noise variance R.

x0

initial state vector X(0\mid0).

v0

initial state covariance matrix V(0\mid0).

filter.end

end point of filtering.

predict.end

end point of prediction.

minmax

lower and upper limits of observations.

missed

start positions of the intervals of missing observations.

np

number of missing observations in each interval.

plot

logical. If TRUE (default), mean vectors of the smoother and estimation error are plotted.

...

graphical arguments passed to plot.smooth.

Details

The linear Gaussian state space model is

x_n = F_n x_{n-1} + G_n v_n,

y_n = H_n x_n + w_n,

where y_n is a univariate time series and x_n is an m-dimensional state vector.

F_n, G_n and H_n are an m \times m matrix, an m \times k matrix and a vector of length m, respectively. Q_n is a k \times k matrix and R_n is a scalar. v_n is the system noise and w_n is the observation noise, where we assume that E(v_n w_n) = 0, v_n \sim N(0, Q_n) and w_n \sim N(0, R_n). The user must supply all matrices of the state space model and their parameters. In the current version, F_n, G_n, H_n, Q_n and R_n must be time invariant.
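
For concreteness, the following is a minimal sketch (not taken from the package; the coefficients a1 and a2 are made-up illustration values) of how an AR(2) model is written in this state space form, using the same matrix layout as in the Examples below.

# State space form of an AR(2) model with state x_n = (y_n, y_{n-1})'
a1 <- 1.2                                       # hypothetical AR coefficients,
a2 <- -0.5                                      # for illustration only
f <- matrix(c(a1, a2,
              1,  0), nrow = 2, byrow = TRUE)   # transition matrix F (companion form)
g <- c(1, 0)                                    # system noise enters the first state component
h <- c(1, 0)                                    # observation is the first state component
q <- 1.0                                        # system noise variance Q (illustrative value)
r <- 0.0                                        # no observation noise for a pure AR model

These objects would then be passed to tsmooth() together with y, x0 and v0.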

Value

An object of class "smooth", which is a list with the following components:

mean.smooth

mean vectors of the smoother.

cov.smooth

variance of the smoother.

esterr

estimation error.

llkhood

log-likelihood.

aic

AIC.
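
As a brief usage sketch (assuming s1 holds the result of a tsmooth() call, as in the Examples below), the components can be extracted with the usual list accessors:

s1$mean.smooth   # smoothed mean vectors
s1$cov.smooth    # smoothed variances
s1$llkhood       # log-likelihood
s1$aic           # AIC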

References

Kitagawa, G. (2020) Introduction to Time Series Modeling with Applications in R. Chapman & Hall/CRC.

Kitagawa, G. and Gersch, W. (1996) Smoothness Priors Analysis of Time Series. Lecture Notes in Statistics, No.116, Springer-Verlag.

Examples

## Example of prediction (AR model)
data(BLSALLFOOD)
BLS120 <- BLSALLFOOD[1:120]
z1 <- arfit(BLS120, plot = FALSE)
tau2 <- z1$sigma2

# m = maice.order, k=1
m1 <- z1$maice.order
arcoef <- z1$arcoef[[m1]]
f <- matrix(0.0e0, m1, m1)
f[1, ] <- arcoef                   # first row holds the AR coefficients
if (m1 != 1)
  for (i in 2:m1) f[i, i-1] <- 1   # subdiagonal shifts past values (companion form)
g <- c(1, rep(0.0e0, m1-1))        # system noise enters the first state component only
h <- c(1, rep(0.0e0, m1-1))        # observation is the first state component
q <- tau2[m1+1]                    # innovation variance of the selected AR order
r <- 0.0e0                         # no observation noise
x0 <- rep(0.0e0, m1)
v0 <- NULL

s1 <- tsmooth(BLS120, f, g, h, q, r, x0, v0, filter.end = 120, predict.end = 156)
s1

plot(s1, BLSALLFOOD)

## Example of interpolation of missing values (AR model)
z2 <- arfit(BLSALLFOOD, plot = FALSE)
tau2 <- z2$sigma2

# m = maice.order, k=1
m2 <- z2$maice.order
arcoef <- z2$arcoef[[m2]]
f <- matrix(0.0e0, m2, m2)
f[1, ] <- arcoef
if (m2 != 1)
  for (i in 2:m2) f[i, i-1] <- 1
g <- c(1, rep(0.0e0, m2-1))
h <- c(1, rep(0.0e0, m2-1))
q <- tau2[m2+1]
r <- 0.0e0
x0 <- rep(0.0e0, m2)
v0 <- NULL

tsmooth(BLSALLFOOD, f, g, h, q, r, x0, v0, missed = c(41, 101), np = c(30, 20))
