ts_knn: KNN Time Series Prediction

View source: R/ts_knn.R

ts_knn  R Documentation

KNN Time Series Prediction

Description

Creates a prediction object that applies K-Nearest Neighbors (KNN) regression to time series forecasting via sliding windows.
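To illustrate what "sliding windows" means here, the base-R sketch below (using `embed()`, which is not part of the package API) turns a short series into rows of lagged inputs paired with a target value:

```r
# Illustration only: sliding windows over a short series using base R.
y <- 1:6
w <- embed(y, 4)      # each row: y[t], y[t-1], y[t-2], y[t-3]
X <- w[, 4:2]         # three lagged inputs, oldest first
target <- w[, 1]      # the value each window should predict
cbind(X, target)
```

Each row of `X` is one window; KNN then measures similarity between such rows.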

Usage

ts_knn(preprocess = NA, input_size = NA, k = NA)

Arguments

preprocess

Normalization preprocessor (e.g., ts_norm_gminmax()).

input_size

Integer. Number of lagged inputs.

k

Integer. Number of neighbors.

Details

KNN regression predicts a value as the average (or weighted average) of the outputs of the k most similar windows in the training set. Similarity is computed in the feature space induced by the lagged inputs. Because KNN is distance-based, normalizing the inputs (e.g., with ts_norm_gminmax()) is recommended.
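The idea above can be sketched in a few lines of base R. This is a minimal illustration, not the package's implementation: `knn_predict` and `make_windows` are hypothetical helper names, and plain Euclidean distance with an unweighted mean is assumed.

```r
# Sketch of KNN regression on sliding windows (base R; illustration only).

# Build lagged windows: each row holds `input_size` lags, target is next value.
make_windows <- function(y, input_size) {
  n <- length(y) - input_size
  X <- t(sapply(seq_len(n), function(i) y[i:(i + input_size - 1)]))
  list(X = X, y = y[(input_size + 1):length(y)])
}

# Predict as the mean output of the k training windows nearest to x_new.
knn_predict <- function(X_train, y_train, x_new, k = 3) {
  d <- sqrt(rowSums(sweep(X_train, 2, x_new)^2))  # Euclidean distances
  mean(y_train[order(d)[1:k]])                    # average the k nearest targets
}

y <- sin(seq(0, 4 * pi, length.out = 100))
w <- make_windows(y, input_size = 4)
pred <- knn_predict(w$X, w$y, x_new = tail(y, 4), k = 3)
```

Since the neighbors' targets are averaged, the prediction always lies within the range of observed training outputs, which is one reason normalization matters for distance computations but also why KNN cannot extrapolate beyond the training range.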

Value

A ts_knn object (S3) inheriting from ts_regsw.

References

  • T. M. Cover and P. E. Hart (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21–27.

Examples

# Example: distance-based regression on sliding windows
# Load tools and example series
library(daltoolbox)
data(tsd)

# Build sliding windows of size 10 and preview a few rows
ts <- ts_data(tsd$y, 10)
ts_head(ts, 3)

# Split end of series as test and project (X, y)
samp <- ts_sample(ts, test_size = 5)
io_train <- ts_projection(samp$train)
io_test <- ts_projection(samp$test)

# Define KNN regressor and fit (distance-based; normalization recommended)
model <- ts_knn(ts_norm_gminmax(), input_size = 4, k = 3)
model <- fit(model, x = io_train$input, y = io_train$output)

# Predict multiple steps ahead and evaluate
prediction <- predict(model, x = io_test$input[1,], steps_ahead = 5)
prediction <- as.vector(prediction)
output <- as.vector(io_test$output)

ev_test <- evaluate(model, output, prediction)
ev_test

tspredit documentation built on Feb. 11, 2026, 9:08 a.m.