ts_rf: Random Forest

View source: R/ts_rf.R

Random Forest

Description

Creates a time series prediction object that applies Random Forest regression to sliding-window inputs.

It wraps the randomForest package to fit an ensemble of regression trees whose averaged predictions form the forecast.
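
For orientation, a minimal construction sketch follows (the settings are arbitrary and only illustrative; fitting happens later through fit(), as in the Examples section):

# Minimal construction sketch; arbitrary settings, no fitting yet
library(daltoolbox)
library(tspredit)

model <- ts_rf(ts_norm_gminmax(), input_size = 4, ntree = 100)
inherits(model, "ts_regsw")  # TRUE: shares the sliding-window regression interface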

Usage

ts_rf(preprocess = NA, input_size = NA, nodesize = 1, ntree = 10, mtry = NULL)

Arguments

preprocess

Normalization preprocessor (e.g., ts_norm_gminmax()).

input_size

Integer. Number of lagged inputs used by the model.

nodesize

Integer. Minimum size of terminal nodes; larger values yield shallower trees.

ntree

Integer. Number of trees in the forest.

mtry

Integer. Number of variables randomly sampled as candidates at each split.

Details

Random forests reduce variance by averaging many decorrelated trees. Applied to tabular sliding-window features, they can capture nonlinearities and lag interactions without heavy feature engineering. Normalizing inputs (via preprocess) keeps windows comparable, and mtry, ntree, and nodesize are the main tuning parameters.
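
As an illustrative sketch (not part of the shipped example), the main hyperparameters can be compared on the same held-out split used in the Examples section. The mse field on the object returned by evaluate() is an assumption and may be named differently in your daltoolbox version.

# Illustrative tuning sketch: compare ntree/nodesize settings on one split.
# Reuses the data preparation shown in the Examples section below;
# ev$mse as the reported mean squared error is an assumption.
library(daltoolbox)
library(tspredit)
data(tsd)

set.seed(1)
ts <- ts_data(tsd$y, 10)
samp <- ts_sample(ts, test_size = 5)
io_train <- ts_projection(samp$train)
io_test <- ts_projection(samp$test)

grid <- expand.grid(ntree = c(50, 200), nodesize = c(1, 5))
for (i in seq_len(nrow(grid))) {
  candidate <- ts_rf(ts_norm_gminmax(), input_size = 4,
                     ntree = grid$ntree[i], nodesize = grid$nodesize[i])
  candidate <- fit(candidate, x = io_train$input, y = io_train$output)
  pred <- as.vector(predict(candidate, x = io_test$input[1,], steps_ahead = 5))
  ev <- evaluate(candidate, as.vector(io_test$output), pred)
  cat("ntree =", grid$ntree[i], "nodesize =", grid$nodesize[i],
      "mse =", ev$mse, "\n")
}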

Value

A ts_rf object (S3) inheriting from ts_regsw.

References

  • Breiman, L. (2001). Random forests. Machine Learning, 45(1), 5–32.

Examples

# Example: sliding-window Random Forest
# Load the modeling packages and the example dataset
library(daltoolbox)
library(tspredit)
data(tsd)

# Convert the series into sliding windows of size 10 and preview
ts <- ts_data(tsd$y, 10)
ts_head(ts, 3)

# Train/test split and (X, y) projection
samp <- ts_sample(ts, test_size = 5)
io_train <- ts_projection(samp$train)
io_test <- ts_projection(samp$test)

# Define Random Forest and fit (tune ntree/mtry/nodesize as needed)
model <- ts_rf(ts_norm_gminmax(), input_size = 4, nodesize = 3, ntree = 50)
model <- fit(model, x = io_train$input, y = io_train$output)

# Forecast multiple steps and assess error
prediction <- predict(model, x = io_test$input[1,], steps_ahead = 5)
prediction <- as.vector(prediction)
output <- as.vector(io_test$output)

ev_test <- evaluate(model, output, prediction)
ev_test
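
As an optional cross-check (not part of the shipped example), the same forecast error can be recomputed directly in base R from the vectors built above:

# Cross-check the multi-step forecast error in base R
mse <- mean((output - prediction)^2)
mse
sqrt(mse)  # RMSE, on the original scale of tsd$y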
