Description

Produces output for the purpose of constructing graphical displays such as the `classmap`. The user first needs to run a support vector machine classification on the data by `e1071::svm`, with the option `probability = TRUE`. This classification can be with two or more classes. The output of `e1071::svm` is then an argument to `vcr.svm.train`. As `e1071::svm` does not output the data itself, it needs to be given as well, in the arguments `X` and `y`.

Usage

```
vcr.svm.train(X, y, svfit, ortho = FALSE)
```

Arguments

`X`: matrix of data coordinates, as used in `e1071::svm`.

`y`: factor with the given (observed) class labels. It is crucial that `X` and `y` are exactly the same as in the call to `e1071::svm`.

`svfit`: an object returned by `e1071::svm`, which was called with the same `X` and `y`.

`ortho`: if `TRUE`, farness is computed in the orthogonal complement of the vector beta given by the support vector machine. This is only possible for two classes, else there would be several beta vectors.
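To illustrate the `ortho` option on a two-class problem, a minimal sketch (the data `X2`, labels `y2`, and the linear kernel choice here are illustrative assumptions, not from this page; `probability = TRUE` is required as described above):

```r
library(e1071)
library(classmap)
# Two well-separated Gaussian classes (illustrative data):
set.seed(2)
X2 <- rbind(matrix(rnorm(100), ncol = 2) + 2,
            matrix(rnorm(100), ncol = 2) - 2)
y2 <- as.factor(rep(c("a", "b"), each = 50))
fit2 <- svm(y2 ~ ., data = data.frame(X = X2, y2 = y2),
            kernel = "linear", probability = TRUE)
# With 2 classes we may compute farness in the orthogonal
# complement of the SVM's beta vector:
vcr2 <- vcr.svm.train(X2, y2, svfit = fit2, ortho = TRUE)
```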

Value

A list with components:

`yint`: number of the given class of each case. Can contain `NA`'s.

`y`: given class label of each case. Can contain `NA`'s.

`levels`: levels of the response `y`.

`predint`: predicted class number of each case. Always exists.

`pred`: predicted label of each case.

`altint`: number of the alternative class. Among the classes different from the given class, it is the one with the highest posterior probability. Is `NA` for cases whose `y` is missing.

`altlab`: label of the alternative class. Is `NA` for cases whose `y` is missing.

`PAC`: probability of the alternative class. Is `NA` for cases whose `y` is missing.

`figparams`: parameters used to compute `fig`, can be used for new data.

`fig`: distance of each case `i` from each class `g`. Always exists.

`farness`: farness of each case from its given class. Is `NA` for cases whose `y` is missing.

`ofarness`: for each case `i`, its lowest `fig[i, g]` over all classes `g`. Always exists.

`svfit`: as it was input, will be useful for new data.

`X`: the matrix of data coordinates from the arguments. This is useful for classifying new data.
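The returned components can be inspected directly; `PAC` and `farness` are the coordinates displayed by `classmap`. A short sketch, assuming `vcr.train` was produced as in the Examples below:

```r
# PAC (probability of the alternative class) and farness
# are both in [0, 1] and drive the class map:
summary(vcr.train$PAC)
summary(vcr.train$farness)
# Cases predicted into a different class than the given one:
which(vcr.train$predint != vcr.train$yint)
```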

Author(s)

Raymaekers J., Rousseeuw P.J.

References

Raymaekers J., Rousseeuw P.J., Hubert M. (2021). Class maps for visualizing classification results. *Technometrics*, appeared online. doi: 10.1080/00401706.2021.1927849

See Also

`vcr.svm.newdata`, `classmap`, `silplot`, `stackedplot`, `e1071::svm`

Examples

```
library(e1071)
set.seed(1); X <- matrix(rnorm(200 * 2), ncol = 2)
X[1:100, ] <- X[1:100, ] + 2
X[101:150, ] <- X[101:150, ] - 2
y <- as.factor(c(rep("blue", 150), rep("red", 50)))
cols <- c("deepskyblue3", "red")
plot(X, col = cols[as.numeric(y)], pch = 19)
# We now fit an SVM with radial basis kernel to the data:
set.seed(1) # to make the result of svm() reproducible.
svmfit <- svm(y ~ ., data = data.frame(X = X, y = y),
              scale = FALSE, kernel = "radial", cost = 10,
              gamma = 1, probability = TRUE)
plot(svmfit$decision.values, col = cols[as.numeric(y)]); abline(h = 0)
# so the decision values separate the classes reasonably well.
plot(svmfit, data = data.frame(X = X, y = y), X.2~X.1, col = cols)
# The boundary is far from linear (but in feature space it is).
vcr.train <- vcr.svm.train(X, y, svfit = svmfit)
confmat.vcr(vcr.train)
stackedplot(vcr.train, classCols = cols)
classmap(vcr.train, "blue", classCols = cols)
classmap(vcr.train, "red", classCols = cols)
# For more examples, we refer to the vignette:
## Not run:
vignette("Support_vector_machine_examples")
## End(Not run)
```
