vcr.svm.newdata: Support vector machine classification of new data.

View source: R/VCR_SVM.R

Description

Carries out a support vector machine classification of new data using the output of vcr.svm.train on the training data, and computes the quantities needed for its visualization.

Usage

vcr.svm.newdata(Xnew, ynew = NULL, vcr.svm.train.out)

Arguments

Xnew

data matrix of the new data, with the same number of columns as in the training data. Missing values in Xnew are not allowed.

ynew

factor with the given class membership of each new case. Can be NA for some or all cases. If NULL, ynew is assumed to be NA everywhere.

vcr.svm.train.out

output of vcr.svm.train on the training data.

Value

A list with components:

yintnew

number of the given class of each case. Can contain NA's.

ynew

given class label of each case. Can contain NA's.

levels

levels of the response, from vcr.svm.train.out.

predint

predicted class number of each case. Always exists.

pred

predicted label of each case.

altint

number of the alternative class. Among the classes different from the given class, it is the one with the highest posterior probability. Is NA for cases whose ynew is missing.

altlab

alternative label if yintnew was given, else NA.

PAC

probability of the alternative class. Is NA for cases whose ynew is missing.

figparams

parameters for computing fig, obtained from the training data.

fig

distance of each case i from each class g. Always exists.

farness

farness of each case from its given class. Is NA for cases whose ynew is missing.

ofarness

for each case i, its lowest fig[i,g] to any class g. Always exists.

Author(s)

Raymaekers J., Rousseeuw P.J.

References

Raymaekers J., Rousseeuw P.J., Hubert M. (2021). Class maps for visualizing classification results. Technometrics, forthcoming.

See Also

vcr.svm.train, classmap, e1071::svm

Examples

library(e1071)
# Generate two-class training data:
set.seed(1); X = matrix(rnorm(200 * 2), ncol = 2)
X[1:100, ] = X[1:100, ] + 2
X[101:150, ] = X[101:150, ] - 2
y = as.factor(c(rep("blue", 150), rep("red", 50)))
# We now fit an SVM with radial basis kernel to the data:
set.seed(1) # to make the result of svm() reproducible
svmfit = svm(y ~ ., data = data.frame(X = X, y = y), scale = FALSE,
             kernel = "radial", cost = 10, gamma = 1, probability = TRUE)
vcr.train = vcr.svm.train(X, y, svfit = svmfit)
# As "new" data we take a subset of the training data:
inds = c(1:25, 101:125, 151:175)
vcr.test = vcr.svm.newdata(X[inds, ], y[inds], vcr.train)
plot(vcr.test$PAC, vcr.train$PAC[inds]); abline(0, 1) # should match
plot(vcr.test$farness, vcr.train$farness[inds]); abline(0, 1)
confmat.vcr(vcr.test)
cols = c("deepskyblue3", "red")
stackedplot(vcr.test, classCols = cols)
classmap(vcr.train, "blue", classCols = cols) # training data, for comparison
classmap(vcr.test, "blue", classCols = cols)
classmap(vcr.train, "red", classCols = cols) # training data, for comparison
classmap(vcr.test, "red", classCols = cols)


# For more examples, we refer to the vignettes:
vignette("Support_vector_machine_examples")
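A further sketch, continuing from the example above: per the Value section, passing ynew = NULL classifies fully unlabeled new data, in which case predint and pred always exist while PAC, altint, and farness are NA. The object name vcr.unlab below is illustrative, not part of the package.

```r
# Hypothetical continuation: classify the same cases without labels.
vcr.unlab = vcr.svm.newdata(X[inds, ], ynew = NULL, vcr.train)
stopifnot(all(!is.na(vcr.unlab$predint))) # predictions always exist
stopifnot(all(is.na(vcr.unlab$PAC)))      # PAC is NA when ynew is missing
```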

classmap documentation built on May 10, 2021, 9:10 a.m.