details_svm_rbf_kernlab    R Documentation

Radial basis function support vector machines (SVMs) via kernlab

Description

kernlab::ksvm() fits a support vector machine model. For classification, the model tries to maximize the width of the margin between classes. For regression, the model optimizes a robust loss function that is only affected by very large model residuals.

Details

For this engine, there are multiple modes: classification and regression.

Tuning Parameters

This model has 3 tuning parameters:

  • cost: Cost (type: double, default: 1.0)

  • rbf_sigma: Radial Basis Function sigma (type: double, default: see below)

  • margin: Insensitivity Margin (type: double, default: 0.1)

There is no default for the radial basis function kernel parameter (rbf_sigma); kernlab estimates it from the data using a heuristic method (see kernlab::sigest()). Because this method uses random numbers, the model will not be reproducible unless the seed is set before fitting.
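
Since the sigma estimate relies on random numbers, setting the seed immediately before fitting makes the estimate, and therefore the fit, repeatable. Below is a minimal sketch, assuming the parsnip and kernlab packages are installed; the mtcars data and the cost value are purely illustrative.

library(parsnip)

svm_spec <- svm_rbf(cost = 1) %>%
  set_engine("kernlab") %>%
  set_mode("regression")

# rbf_sigma is left unset, so kernlab::sigest() estimates it at fit time;
# setting the seed first makes that estimate reproducible.
set.seed(123)
svm_fit <- fit(svm_spec, mpg ~ ., data = mtcars)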

Translation from parsnip to the original package (regression)

svm_rbf(
  cost = double(1),
  rbf_sigma = double(1), 
  margin = double(1)
) %>%  
  set_engine("kernlab") %>% 
  set_mode("regression") %>% 
  translate()
## Radial Basis Function Support Vector Machine Model Specification (regression)
## 
## Main Arguments:
##   cost = double(1)
##   rbf_sigma = double(1)
##   margin = double(1)
## 
## Computational engine: kernlab 
## 
## Model fit template:
## kernlab::ksvm(x = missing_arg(), data = missing_arg(), C = double(1), 
##     epsilon = double(1), kernel = "rbfdot", kpar = list(sigma = ~double(1)))

Translation from parsnip to the original package (classification)

svm_rbf(
  cost = double(1),
  rbf_sigma = double(1)
) %>% 
  set_engine("kernlab") %>% 
  set_mode("classification") %>% 
  translate()
## Radial Basis Function Support Vector Machine Model Specification (classification)
## 
## Main Arguments:
##   cost = double(1)
##   rbf_sigma = double(1)
## 
## Computational engine: kernlab 
## 
## Model fit template:
## kernlab::ksvm(x = missing_arg(), data = missing_arg(), C = double(1), 
##     kernel = "rbfdot", prob.model = TRUE, kpar = list(sigma = ~double(1)))

The margin parameter does not apply to classification models.

Note that the "kernlab" engine does not naturally estimate class probabilities. To produce them, the decision values of the model are converted to probabilities using Platt scaling. This method fits an additional model on top of the SVM model. When fitting the Platt scaling model, random numbers are used that are not reproducible or controlled by R’s random number stream.
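
As a sketch of what this looks like in practice (assuming the packages are installed, and using the iris data and parameter values purely for illustration), class probabilities are requested with type = "prob". Per the paragraph above, the Platt scaling step uses random numbers that set.seed() does not control.

library(parsnip)

cls_spec <- svm_rbf(cost = 1, rbf_sigma = 0.1) %>%
  set_engine("kernlab") %>%
  set_mode("classification")

cls_fit <- fit(cls_spec, Species ~ ., data = iris)

# The probability columns come from the Platt scaling model fit on top of
# the SVM (prob.model = TRUE in the translation above).
predict(cls_fit, new_data = iris[1:5, ], type = "prob")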

Preprocessing requirements

Factor/categorical predictors need to be converted to numeric values (e.g., dummy or indicator variables) for this engine. When using the formula method via fit(), parsnip will convert factor columns to indicators.

Predictors should have the same scale. One way to achieve this is to center and scale each predictor so that it has mean zero and a variance of one.
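
One way to satisfy both requirements is a recipes preprocessing pipeline. This is a sketch under the assumption that the recipes package is used (any approach that creates indicator variables and standardizes the predictors would work), with a small made-up data set for illustration.

library(recipes)

# Toy data with one factor predictor (illustrative only).
toy <- data.frame(
  y   = rnorm(20),
  x1  = rnorm(20),
  grp = factor(rep(c("a", "b"), 10))
)

svm_rec <- recipe(y ~ ., data = toy) %>%
  step_dummy(all_nominal_predictors()) %>%    # indicator variables for factors
  step_normalize(all_numeric_predictors())    # mean zero, variance one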

Case weights

The underlying model implementation does not allow for case weights.

Saving fitted model objects

This model object contains data that are not required to make predictions. When saving the model for the purpose of prediction, the size of the saved object might be substantially reduced by using functions from the butcher package.
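
A short sketch of that workflow, assuming the butcher package is installed and that svm_fit is a fitted parsnip model such as the one shown earlier:

library(butcher)

# butcher() strips components of the fitted object that are not needed for
# prediction before the object is saved to disk.
small_fit <- butcher(svm_fit)
saveRDS(small_fit, "svm_rbf_kernlab_fit.rds")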

Examples

The “Fitting and Predicting with parsnip” article contains examples for svm_rbf() with the "kernlab" engine.
