ts_svm    R Documentation
Create a time series prediction object that uses Support Vector Regression (SVR) on sliding-window inputs.
It wraps the e1071 package to fit epsilon-insensitive regression with
linear, radial, polynomial, or sigmoid kernels.
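The underlying mechanics can be sketched directly with e1071: build sliding-window (lagged) inputs from a series and fit an epsilon-SVR on them. This is an illustrative sketch, not daltoolbox's internal implementation; it assumes the e1071 package is installed.

```r
# Sketch: epsilon-SVR on sliding-window inputs, using e1071 directly.
library(e1071)

y <- as.numeric(AirPassengers)   # any univariate series
lags <- 4
n <- length(y) - lags
X <- embed(y, lags + 1)          # row t: y[t], y[t-1], ..., y[t-lags]
train <- data.frame(target = X[, 1], X[, -1, drop = FALSE])

# Radial-kernel epsilon-insensitive regression on the lagged inputs
model <- svm(target ~ ., data = train,
             kernel = "radial", epsilon = 0.1, cost = 10)
pred <- predict(model, train[, -1, drop = FALSE])
```

ts_svm performs this windowing (plus normalization) for you; the sketch only shows what the wrapped e1071 call operates on.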
ts_svm(
preprocess = NA,
input_size = NA,
kernel = "radial",
epsilon = 0,
cost = 10
)
preprocess    Normalization preprocessor (e.g., ts_norm_gminmax()).
input_size    Integer. Number of lagged inputs used by the model.
kernel        Character. One of "linear", "radial", "polynomial", "sigmoid".
epsilon       Numeric. Width of the epsilon-insensitive loss tube.
cost          Numeric. Regularization parameter controlling the penalty for margin violations.
SVR aims to find a function with at most epsilon deviation from
each training point while being as flat as possible. The cost parameter
controls the trade-off between margin width and violations; epsilon
controls the insensitivity tube width. RBF kernels often work well for
nonlinear series; tune cost, epsilon, and kernel hyperparameters.
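The role of the epsilon tube can be seen in model sparsity: points inside the tube incur no loss, so widening epsilon typically reduces the number of support vectors. A minimal sketch, assuming e1071 is installed (the synthetic sine data is illustrative only):

```r
# Sketch: widening the epsilon tube reduces the support-vector count.
library(e1071)

set.seed(1)
x <- seq(0, 2 * pi, length.out = 100)
d <- data.frame(x = x, y = sin(x) + rnorm(100, sd = 0.1))

narrow <- svm(y ~ x, data = d, kernel = "radial", epsilon = 0.01, cost = 10)
wide   <- svm(y ~ x, data = d, kernel = "radial", epsilon = 0.5,  cost = 10)

c(narrow = narrow$tot.nSV, wide = wide$tot.nSV)  # narrow tube keeps more SVs
```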
A ts_svm object (S3) inheriting from ts_regsw.
C. Cortes and V. Vapnik (1995). Support-Vector Networks. Machine Learning, 20, 273–297.
# Example: SVR with min–max normalization
# Load package and dataset
library(daltoolbox)
data(tsd)
# Create sliding windows and preview
ts <- ts_data(tsd$y, 10)
ts_head(ts, 3)
# Temporal split and (X, y) projection
samp <- ts_sample(ts, test_size = 5)
io_train <- ts_projection(samp$train)
io_test <- ts_projection(samp$test)
# Define SVM regressor and fit to training data
model <- ts_svm(ts_norm_gminmax(), input_size = 4)
model <- fit(model, x = io_train$input, y = io_train$output)
# Multi-step forecast and evaluation
prediction <- predict(model, x = io_test$input[1,], steps_ahead = 5)
prediction <- as.vector(prediction)
output <- as.vector(io_test$output)
ev_test <- evaluate(model, output, prediction)
ev_test
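Since the details above recommend tuning cost, epsilon, and the kernel, one option is to grid-search the underlying SVR with e1071's tune.svm. This sketch tunes e1071 directly rather than through daltoolbox, and assumes e1071 is installed; the synthetic data stands in for your training windows:

```r
# Sketch: grid-search cost and epsilon for the underlying SVR.
library(e1071)

set.seed(1)
x <- seq(0, 2 * pi, length.out = 100)
d <- data.frame(x = x, y = sin(x) + rnorm(100, sd = 0.1))

tuned <- tune.svm(y ~ x, data = d,
                  cost = c(1, 10, 100),
                  epsilon = c(0.01, 0.1, 0.5))
tuned$best.parameters   # cross-validated best (cost, epsilon) pair
```

The selected values can then be passed to ts_svm via its cost and epsilon arguments.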