lssvm {kernlab} | R Documentation

Least Squares Support Vector Machine

Description

The lssvm function is an implementation of the Least Squares SVM. lssvm includes a reduced version of the Least Squares SVM that uses a decomposition of the kernel matrix computed by the csi function.
Usage

## S4 method for signature 'formula'
lssvm(x, data=NULL, ..., subset, na.action = na.omit, scaled = TRUE)
## S4 method for signature 'vector'
lssvm(x, ...)
## S4 method for signature 'matrix'
lssvm(x, y, scaled = TRUE, kernel = "rbfdot", kpar = "automatic",
type = NULL, tau = 0.01, reduced = TRUE, tol = 0.0001,
rank = floor(dim(x)[1]/3), delta = 40, cross = 0, fit = TRUE,
..., subset, na.action = na.omit)
## S4 method for signature 'kernelMatrix'
lssvm(x, y, type = NULL, tau = 0.01,
tol = 0.0001, rank = floor(dim(x)[1]/3), delta = 40, cross = 0,
fit = TRUE, ...)
## S4 method for signature 'list'
lssvm(x, y, scaled = TRUE,
kernel = "stringdot", kpar = list(length=4, lambda = 0.5),
type = NULL, tau = 0.01, reduced = TRUE, tol = 0.0001,
rank = floor(dim(x)[1]/3), delta = 40, cross = 0, fit = TRUE,
..., subset)
Arguments

x: a symbolic description of the model to be fit, a matrix or vector containing the training data when a formula interface is not used, or a kernelMatrix or a list of character vectors.
data: an optional data frame containing the variables in the model. By default the variables are taken from the environment which lssvm is called from.
y: a response vector with one label for each row/component of x.
scaled: a logical vector indicating the variables to be scaled. If scaled is of length 1, the value is recycled as many times as needed and all non-binary variables are scaled. Per default, data are scaled internally to zero mean and unit variance; the center and scale values are returned and used for later predictions.
type: type of problem. Either "classification" or "regression". Depending on whether y is a factor or not, the default setting for type is "classification" or "regression" respectively, but it can be overwritten by setting an explicit value.
kernel: the kernel function used in training and predicting. This parameter can be set to any function, of class kernel, which computes a dot product between two vector arguments. kernlab provides the most popular kernel functions, which can be used by setting the kernel parameter to the following strings:

rbfdot: Radial Basis kernel function ("Gaussian")
polydot: Polynomial kernel function
vanilladot: Linear kernel function
tanhdot: Hyperbolic tangent kernel function
laplacedot: Laplacian kernel function
besseldot: Bessel kernel function
anovadot: ANOVA RBF kernel function
splinedot: Spline kernel function
stringdot: String kernel function

Setting the kernel parameter to "matrix" treats x as a kernel matrix. The kernel parameter can also be set to a user defined function of class kernel by passing the function name as an argument (see the short usage sketch after this argument list).
kpar: the list of hyper-parameters (kernel parameters). This is a list containing the parameters to be used with the kernel function (e.g. sigma for the Radial Basis kernel "rbfdot", or degree, scale and offset for the polynomial kernel "polydot"). For the "rbfdot" kernel, kpar can also be set to the string "automatic" (the default in the matrix interface), which uses the heuristics in sigest to calculate a good sigma value from the data. Hyper-parameters for user defined kernels can be passed through the kpar parameter as well.
tau: the regularization parameter (default 0.01).
reduced: if set to FALSE the full linear problem of the lssvm is solved; when TRUE (the default) a reduced method using the decomposition computed by csi is used.
rank: the maximal rank of the decomposed kernel matrix, see csi.
delta: number of columns of the Cholesky factorization performed in advance, see csi (default 40).
tol: tolerance of the termination criterion for the csi function; a lower tolerance gives a more precise approximation but may increase training time and the rank of the decomposed matrix (default: 0.0001).
fit: indicates whether the fitted values should be computed and included in the model or not (default: TRUE).
cross: if an integer value k>0 is specified, a k-fold cross validation on the training data is performed to assess the quality of the model: the accuracy rate for classification and the Mean Squared Error for regression.
subset: an index vector specifying the cases to be used in the training sample. (NOTE: If given, this argument must be named.)
na.action: a function to specify the action to be taken if NAs are found. The default action is na.omit, which leads to rejection of cases with missing values on any required variable. An alternative is na.fail, which causes an error if NA cases are found. (NOTE: If given, this argument must be named.)
...: additional parameters.
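A brief usage sketch (using the iris data from the examples below) of how the kernel, kpar and cross arguments combine; the polynomial kernel settings and the value of cross here are illustrative, not defaults:

data(iris)
## polynomial kernel with explicit hyper-parameters and 5-fold cross validation
lpoly <- lssvm(Species ~ ., data = iris,
               kernel = "polydot", kpar = list(degree = 2, scale = 1, offset = 1),
               cross = 5)
lpoly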
Details

Least Squares Support Vector Machines are a reformulation of the standard SVM that leads to solving linear KKT systems. The algorithm is based on the minimization of a classical penalized least-squares cost function. The current implementation approximates the kernel matrix by an incomplete Cholesky factorization obtained by the csi function, thus the solution is an approximation to the exact solution of the lssvm optimization problem. The quality of the solution depends on the approximation and can be influenced by the rank, delta and tol parameters.
Value

An S4 object of class "lssvm" containing the fitted model. Accessor functions can be used to access the slots of the object (see the examples and the short sketch after this list), which include:
alpha: the parameters of the fitted lssvm model.
coef: the model coefficients (identical to alpha).
b: the model offset.
xmatrix: the training data used by the model.
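A short sketch of the accessor functions on a fitted model (assuming a model fitted as in the examples below):

data(iris)
lir <- lssvm(Species ~ ., data = iris)
alpha(lir)    ## model parameters
b(lir)        ## model offset
xmatrix(lir)  ## training data used by the model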
Author(s)

Alexandros Karatzoglou, alexandros.karatzoglou@ci.tuwien.ac.at
References

J. A. K. Suykens and J. Vandewalle, Least Squares Support Vector Machine Classifiers, Neural Processing Letters, vol. 9, issue 3, June 1999.
See Also

ksvm, gausspr, csi
Examples

## simple example
data(iris)
lir <- lssvm(Species~.,data=iris)
lir
lirr <- lssvm(Species~.,data= iris, reduced = FALSE)
lirr
## Using the kernelMatrix interface
iris <- unique(iris)
rbf <- rbfdot(0.5)
k <- kernelMatrix(rbf, as.matrix(iris[,-5]))
klir <- lssvm(k, iris[, 5])
klir
pre <- predict(klir, k)
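A brief extension of the example above (using the predictions pre computed there) comparing them with the true labels:

table(pre, iris[, 5])
mean(pre == iris[, 5])   ## training accuracy on the (unique) iris rows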