KNN.acf: Multioutput KNN

View source: R/mstep.R

KNN.acf

Description

Multioutput KNN

Arguments

X: training input [N,n]

Y: training output [N,m]

X.ts: test input [N.ts,n]

k: minimum number of neighbours

dist: type of distance: euclidean or cosine (see the sketch after this list)

F: forgetting factor

C: integer parameter which sets the maximum number of neighbours (C*k)

wta: if TRUE, a winner-takes-all strategy is used; otherwise a weighted combination is made on the basis of the leave-one-out (l-o-o) error

Acf: autocorrelation function of the training series

Reg: number (>1) of null terms to regularise the mean
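
A minimal base-R sketch (not the package internals) of how the two documented distance types could rank training rows for a single test query; the data and variable names here are hypothetical:

set.seed(0)
X <- matrix(rnorm(20 * 3), 20, 3)   ## hypothetical training input [N, n]
q <- rnorm(3)                       ## one hypothetical test input of length n

d.euc <- sqrt(rowSums(sweep(X, 2, q)^2))                         ## euclidean distance
d.cos <- 1 - (X %*% q) / (sqrt(rowSums(X^2)) * sqrt(sum(q^2)))   ## cosine distance

k <- 5
order(d.euc)[1:k]   ## indices of the k nearest neighbours (euclidean)
order(d.cos)[1:k]   ## indices of the k nearest neighbours (cosine)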

Details

KNN.acf is a multioutput KNN for multi-step-ahead prediction. It fits locally constant models and combines them with weights based on the dynamic properties (autocorrelation) of the training time series.
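
The combination step can be pictured as follows: each locally constant model produces a prediction, and its leave-one-out error determines its weight. A minimal base-R sketch of this idea, with hypothetical numbers and an assumed inverse-error weighting (not the package's internal code):

preds   <- c(1.02, 0.97, 1.10)   ## hypothetical predictions of three local models
loo.err <- c(0.05, 0.02, 0.20)   ## their hypothetical leave-one-out errors

w <- (1 / loo.err) / sum(1 / loo.err)   ## assumed weights, inversely proportional to l-o-o error
sum(w * preds)                          ## weighted combination (wta = FALSE)
preds[which.min(loo.err)]               ## winner-takes-all (wta = TRUE)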

Value

vector of N.ts predictions

Author(s)

Gianluca Bontempi <Gianluca.Bontempi@ulb.be>

References

Bontempi G., Ben Taieb S. (2011). Conditionally dependent strategies for multiple-step-ahead prediction in local learning. International Journal of Forecasting, 27(3), 689–699.

Examples

## Multi-step-ahead time series forecasting
require(gbcode)
library(lazy)
t <- seq(0, 200, by = 0.1)
N <- length(t)
H <- 500                            ## prediction horizon
TS <- sin(t) + rnorm(N, sd = 0.1)   ## noisy sinusoidal series
TS.tr <- TS[1:(N - H)]              ## training segment
N.tr <- length(TS.tr)
TS.ts <- TS[(N - H + 1):N]          ## held-out test segment
n <- 3                              ## embedding order
TS.tr <- array(TS.tr, c(length(TS.tr), 1))
E <- MakeEmbedded(TS.tr, n = n, delay = 0, hor = H, 1)
X <- E$inp
Y <- E$out
N.emb <- NROW(X)                    ## number of embedded training samples
Y.cont <- KNN.acf(X, Y, rev(TS.tr[(N.tr - n + 1):N.tr]), TS = TS.tr)
plot(t[(N - H + 1):N], TS.ts)       ## observed test values
lines(t[(N - H + 1):N], Y.cont)     ## H-step-ahead forecast
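
A short follow-up, using only base R, to quantify the accuracy of the H-step forecast against the held-out test segment (assuming, as the plot above implies, that Y.cont and TS.ts both have length H):

mean((Y.cont - TS.ts)^2)   ## mean squared error of the forecast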
