makeFV: Constructs feature vectors from a kernel matrix.

View source: R/VCR_auxiliaryFunctions.R

makeFV    R Documentation

Constructs feature vectors from a kernel matrix.

Description

Constructs feature vectors from a kernel matrix.

Usage

makeFV(kmat, transfmat = NULL, precS = 1e-12)

Arguments

kmat

a kernel matrix. If transfmat is NULL, we are dealing with training data, and kmat must then be a square kernel matrix (of size n by n when there are n cases). Such a PSD matrix kmat can e.g. be produced by makeKernel or by kernlab::kernelMatrix. If on the other hand transfmat is not NULL, we are dealing with a test set; see the Details section for the precise workflow.

transfmat

transformation matrix. If not NULL, it is the value transfmat returned by makeFV on the training data. It has to be a square matrix, with as many rows as there were training cases.

precS

If not NULL, eigenvalues of kmat below precS will be set equal to precS.
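
A minimal sketch of a training-set call. The toy data, the Gaussian kernel from kernlab, and the object names are illustrative:

library(classmap)
library(kernlab)
rbf  <- rbfdot(sigma = 1)                 # Gaussian (RBF) kernel
Xtr  <- matrix(rnorm(50 * 2), ncol = 2)   # toy training data with n = 50 cases
kmat <- kernelMatrix(rbf, Xtr)            # n by n PSD kernel matrix
fv   <- makeFV(kmat)                      # transfmat = NULL: training mode
dim(fv$Xf)                                # one feature vector per training case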

Details

If transfmat is non-NULL, we are dealing with a test set. Denote the number of cases in the test set by m >= 1. Each row of kmat of the test set then must contain the kernel values of a new case with all cases in the training set, so the kernel matrix kmat must have dimensions m by n. The matrix kmat can e.g. be produced by makeKernel. It can also be obtained by running kernlab::kernelMatrix on the union of the training set and the test set, yielding an (n+m) by (n+m) matrix, from which one then takes the [(n+1):(n+m), 1:n] submatrix.
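
As an illustration of these test-set mechanics, continuing the training sketch under Arguments (all object names are illustrative):

Xte      <- matrix(rnorm(10 * 2), ncol = 2)   # m = 10 new cases
kmatTest <- kernelMatrix(rbf, Xte, Xtr)       # m by n: kernel values of new cases with training cases
fvTest   <- makeFV(kmatTest, transfmat = fv$transfmat)
dim(fvTest$Xf)                                # m feature vectors in the training coordinate system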

Value

A list with components:

Xf

When makeFV is applied to the training set, Xf contains the coordinates of n points (one feature vector per case), whose plain inner products equal the kernel matrix of the training set. That is, kmat = Xf %*% t(Xf). The Xf are expressed in an orthogonal basis in which the variance of the coordinates is decreasing, which is useful when plotting the first few coordinates. When makeFV is applied to a test set, Xf contains the coordinates of the feature vectors of the test set in the same space as those of the training set, and then kmat = Xf %*% t(Xf of the training data).

transfmat

square matrix for transforming kmat to Xf.
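
A small check of these properties on the training sketch above (object names are illustrative):

Xf <- fv$Xf
max(abs(as.vector(kmat - Xf %*% t(Xf))))  # should be very small: inner products reproduce kmat
head(apply(Xf, 2, var))                   # coordinate variances, from largest to smallest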

Author(s)

Raymaekers J., Rousseeuw P.J., Hubert M.

References

Raymaekers J., Rousseeuw P.J., Hubert M. (2021). Class maps for visualizing classification results. Technometrics (appeared online). doi:10.1080/00401706.2021.1927849 (link to open access pdf)

See Also

makeKernel

Examples

library(e1071)
set.seed(1); X <- matrix(rnorm(200 * 2), ncol = 2)
X[1:100, ] <- X[1:100, ] + 2
X[101:150, ] <- X[101:150, ] - 2
y <- as.factor(c(rep("blue", 150), rep("red", 50)))
cols <- c("deepskyblue3", "red")
plot(X, col = cols[as.numeric(y)], pch = 19)
# We now fit an SVM with radial basis kernel to the data:
svmfit <- svm(y ~ ., data = data.frame(X = X, y = y), scale = FALSE,
              kernel = "radial", cost = 10, gamma = 1, probability = TRUE)
Kxx <- makeKernel(X, svfit = svmfit)
outFV <- makeFV(Kxx)
Xf <- outFV$Xf # The data matrix in this feature space.
dim(Xf) # The feature vectors are high dimensional.
# The inner products of Xf match the kernel matrix:
max(abs(as.vector(Kxx - crossprod(t(Xf), t(Xf))))) # 3.005374e-13 # tiny, OK
range(rowSums(Xf^2)) # all points in Xf lie on the unit sphere.
pairs(Xf[, 1:5], col = cols[as.numeric(y)])
# In some of these we see spherical effects, e.g.
plot(Xf[, 1], Xf[, 5], col = cols[as.numeric(y)], pch = 19)
# The data look more separable here than in the original
# two-dimensional space.
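
# Illustrative sketch: feature vectors for new data, mapped with the
# transformation obtained on the training set. The points in Xnew and the
# use of makeKernel's X2 argument are for demonstration only.
Xnew  <- matrix(rnorm(20 * 2), ncol = 2)
Kxn   <- makeKernel(Xnew, X2 = X, svfit = svmfit)  # 20 by 200: new cases versus training cases
XfNew <- makeFV(Kxn, transfmat = outFV$transfmat)$Xf
dim(XfNew)  # new cases expressed in the same feature coordinates as Xf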

# For more examples, we refer to the vignette:
## Not run: 
vignette("Support_vector_machine_examples")

## End(Not run)
