# makeFV: Constructs feature vectors from a kernel matrix. In classmap: Visualizing Classification Results

## Description

Constructs feature vectors from a kernel matrix.

## Usage

```
makeFV(kmat, transfmat = NULL, precS = 1e-12)
```

## Arguments

- `kmat`: a kernel matrix. If `transfmat` is `NULL`, we are dealing with training data, and then `kmat` must be a square kernel matrix (of size n by n when there are n cases). Such a PSD matrix `kmat` can e.g. be produced by `makeKernel` or by `kernlab::kernelMatrix`. If on the other hand `transfmat` is not `NULL`, we are dealing with a test set; see Details for the precise working.
- `transfmat`: transformation matrix. If not `NULL`, it is the value `transfmat` returned by `makeFV` on the training data. It has to be a square matrix, with as many rows as there were training data.
- `precS`: if not `NULL`, eigenvalues of `kmat` below `precS` will be set equal to `precS`.
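To make the role of `precS` concrete, the sketch below mimics the clipping described above on a kernel matrix; `clipEigen` is a hypothetical helper for illustration only, not the internal code of `makeFV`.

```
## Conceptual illustration of precS: eigenvalues of the kernel matrix
## that fall below precS are raised to precS, which keeps the spectrum
## strictly positive before the feature vectors are constructed.
clipEigen <- function(kmat, precS = 1e-12) {  # hypothetical helper
  eig <- eigen(kmat, symmetric = TRUE)        # spectral decomposition
  eig$values <- pmax(eig$values, precS)       # clip small or negative eigenvalues
  eig
}
```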

## Details

If `transfmat` is not `NULL`, we are dealing with a test set. Denote the number of cases in the test set by m ≥ 1. Each row of `kmat` of the test set must then contain the kernel values of a new case with all cases in the training set. Therefore the kernel matrix `kmat` must have dimensions m by n. The matrix `kmat` can e.g. be produced by `makeKernel`. It can also be obtained by running `kernlab::kernelMatrix` on the union of the training set and the test set, yielding an (n+m) by (n+m) matrix, from which one then takes the [(n+1):(n+m), 1:n] submatrix.
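As a rough illustration of this route, the sketch below builds the training and test kernel matrices with `kernlab::kernelMatrix` and then calls `makeFV` twice; the data matrices `Xtrain` and `Xtest` and the choice `rbfdot(sigma = 1)` are arbitrary placeholders, not part of the function.

```
library(kernlab)
library(classmap)

set.seed(123)
Xtrain <- matrix(rnorm(100 * 2), ncol = 2)   # n = 100 training cases
Xtest  <- matrix(rnorm(20 * 2), ncol = 2)    # m = 20 new cases
n <- nrow(Xtrain); m <- nrow(Xtest)

rbf  <- rbfdot(sigma = 1)                                  # any kernel would do
Kbig <- as.matrix(kernelMatrix(rbf, rbind(Xtrain, Xtest))) # (n+m) by (n+m)
Ktrain <- Kbig[1:n, 1:n]               # n by n kernel matrix of the training set
Ktest  <- Kbig[(n + 1):(n + m), 1:n]   # m by n kernel matrix of the test set

outTrain <- makeFV(Ktrain)             # feature vectors of the training set
outTest  <- makeFV(Ktest, transfmat = outTrain$transfmat)  # test set, same space
```

Alternatively, `makeKernel` can produce the m by n test kernel matrix directly, as noted above.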

## Value

A list with components:

- `Xf`: when `makeFV` is applied to the training set, `Xf` contains the coordinates of n points (feature vectors), whose plain inner products equal the kernel matrix of the training set, that is, `kmat` = `Xf` `Xf`'. The `Xf` are expressed in an orthogonal basis in which the variance of the coordinates is decreasing, which is useful when plotting the first few coordinates. When `makeFV` is applied to a test set, `Xf` contains the coordinates of the feature vectors of the test set in the same space as those of the training set, and then `kmat` = `Xf` `Xf_training`'.
- `transfmat`: square matrix for transforming `kmat` to `Xf`. The actual transformation needs to be carried out by `makeFV` because it is not a simple matrix product.
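Continuing the hypothetical sketch from the Details section (the objects `Ktrain`, `Ktest`, `outTrain`, and `outTest` come from that sketch, not from the package), these properties can be checked as follows.

```
XfTrain <- outTrain$Xf
XfTest  <- outTest$Xf

# Training set: plain inner products of Xf reproduce the kernel matrix.
max(abs(Ktrain - tcrossprod(XfTrain)))     # should be numerically tiny

# Test set: inner products with the training feature vectors reproduce kmat.
max(abs(Ktest - XfTest %*% t(XfTrain)))    # should also be tiny

# The variance of the coordinates of Xf decreases, so the first few
# columns carry most of the structure and are the ones worth plotting.
head(apply(XfTrain, 2, var))
```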

## Author(s)

Raymaekers J., Rousseeuw P.J.

## References

Raymaekers J., Rousseeuw P.J., Hubert M. (2021). Class maps for visualizing classification results. Technometrics (appeared online). doi: 10.1080/00401706.2021.1927849

## See Also

`makeKernel`

## Examples

```
library(e1071)
library(classmap)
set.seed(1); X <- matrix(rnorm(200 * 2), ncol = 2)
X[1:100, ] <- X[1:100, ] + 2
X[101:150, ] <- X[101:150, ] - 2
y <- as.factor(c(rep("blue", 150), rep("red", 50)))
cols <- c("deepskyblue3", "red")
plot(X, col = cols[as.numeric(y)], pch = 19)
# We now fit an SVM with radial basis kernel to the data:
svmfit <- svm(y~., data = data.frame(X = X, y = y), scale = FALSE,
              kernel = "radial", cost = 10, gamma = 1, probability = TRUE)
Kxx <- makeKernel(X, svfit = svmfit)
outFV <- makeFV(Kxx)
Xf <- outFV$Xf # The data matrix in this feature space.
dim(Xf) # The feature vectors are high dimensional.
# The inner products of Xf match the kernel matrix:
max(abs(as.vector(Kxx - crossprod(t(Xf), t(Xf))))) # 6.167711e-11 # tiny, OK
range(rowSums(Xf^2)) # all points in Xf lie on the unit sphere.
pairs(Xf[, 1:5], col = cols[as.numeric(y)])
# In some of these we see spherical effects, e.g.
plot(Xf[, 1], Xf[, 5], col = cols[as.numeric(y)], pch = 19)
# The data look more separable here than in the original
# two-dimensional space.
# For more examples, we refer to the vignette:
## Not run: vignette("Support_vector_machine_examples")
## End(Not run)
```