vcr.svm.newdata: Prepare for visualization of a support vector machine classification on new data

vcr.svm.newdata    R Documentation

Prepare for visualization of a support vector machine classification on new data.

Description

Carries out a support vector machine classification of new data using the output of vcr.svm.train on the training data, and computes the quantities needed for its visualization.

Usage

vcr.svm.newdata(Xnew, ynew = NULL, vcr.svm.train.out)
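
A typical workflow first calls vcr.svm.train on the training data and then passes its output to vcr.svm.newdata. The sketch below illustrates this; the objects Xtrain, ytrain, Xnew and ynew are placeholders and not part of the package (see the Examples section for a complete, runnable version):

svmfit <- e1071::svm(y ~ ., data = data.frame(X = Xtrain, y = ytrain),
                     probability = TRUE)       # probabilities are required
vcr.train <- vcr.svm.train(Xtrain, ytrain, svfit = svmfit)
vcr.new   <- vcr.svm.newdata(Xnew, ynew, vcr.train)
classmap(vcr.new, 1, classCols = c("deepskyblue3", "red"))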

Arguments

Xnew

data matrix of the new data, with the same number of columns as in the training data. Missing values in Xnew are not allowed.

ynew

factor with the class membership of each new case. Can be NA for some or all cases. If NULL, it is assumed to be NA everywhere.

vcr.svm.train.out

output of vcr.svm.train on the training data.

Value

A list with components:

yintnew

number of the given class of each case. Can contain NA's.

ynew

given class label of each case. Can contain NA's.

levels

levels of the response, from vcr.svm.train.out.

predint

predicted class number of each case. Always exists.

pred

predicted label of each case.

altint

number of the alternative class. Among the classes different from the given class, it is the one with the highest posterior probability. Is NA for cases whose ynew is missing.

altlab

alternative label if yintnew was given, else NA.

PAC

probability of the alternative class. Is NA for cases whose ynew is missing.

fig

distance of each case i from each class g. Always exists.

farness

farness of each case from its given class. Is NA for cases whose ynew is missing.

ofarness

for each case i, its lowest fig[i,g] to any class g. Always exists.
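
To give an idea of how these components are used, here is a short sketch. It assumes labeled new data (so ynew is not NA) and reuses the vcr.test object created in the Examples section below:

table(given = vcr.test$ynew, predicted = vcr.test$pred) # given vs predicted labels
which(vcr.test$PAC > 0.5) # cases whose alternative class is more probable than their given class
summary(vcr.test$farness) # farness of each labeled case from its given class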

Author(s)

Raymaekers J., Rousseeuw P.J.

References

Raymaekers J., Rousseeuw P.J., Hubert M. (2021). Class maps for visualizing classification results. Technometrics (appeared online). doi:10.1080/00401706.2021.1927849 (open access).

See Also

vcr.svm.train, classmap, silplot, stackedplot, e1071::svm

Examples


library(e1071)
library(classmap)
set.seed(1); X <- matrix(rnorm(200 * 2), ncol = 2)
X[1:100, ] <- X[1:100, ] + 2
X[101:150, ] <- X[101:150, ] - 2
y <- as.factor(c(rep("blue", 150), rep("red", 50)))
# We now fit an SVM with radial basis kernel to the data:
set.seed(1) # to make the result of svm() reproducible.
svmfit <- svm(y ~ ., data = data.frame(X = X, y = y),
              scale = FALSE, kernel = "radial", cost = 10,
              gamma = 1, probability = TRUE)
vcr.train <- vcr.svm.train(X, y, svfit = svmfit)
# As "new" data we take a subset of the training data:
inds <- c(1:25, 101:125, 151:175)
vcr.test <- vcr.svm.newdata(X[inds, ], y[inds], vcr.train)
plot(vcr.test$PAC, vcr.train$PAC[inds]); abline(0, 1) # PAC values match those of the training output
plot(vcr.test$farness, vcr.train$farness[inds]); abline(0, 1) # so do the farness values
confmat.vcr(vcr.test)
cols <- c("deepskyblue3", "red")
stackedplot(vcr.test, classCols = cols)
classmap(vcr.train, "blue", classCols = cols) # for comparison
classmap(vcr.test, "blue", classCols = cols)
classmap(vcr.train, "red", classCols = cols) # for comparison
classmap(vcr.test, "red", classCols = cols)


# For more examples, we refer to the vignette:
## Not run: 
vignette("Support_vector_machine_examples")

## End(Not run)
