Description
Training of the kknn method via leave-one-out (train.kknn) or k-fold (cv.kknn) cross-validation.
Usage

train.kknn(formula, data, kmax = 11, ks = NULL, distance = 2, kernel = "optimal",
    ykernel = NULL, scale = TRUE, contrasts = c('unordered' = "contr.dummy",
    ordered = "contr.ordinal"), ...)

cv.kknn(formula, data, kcv = 10, ...)
Arguments

formula: A formula object.

data: Matrix or data frame.

kmax: Maximum number of k, if ks is not specified.

ks: A vector specifying values of k. If not NULL, this takes precedence over kmax (see the sketch after this list).

distance: Parameter of Minkowski distance.

kernel: Kernel to use. Possible choices are "rectangular" (which is standard unweighted knn), "triangular", "epanechnikov" (or beta(2,2)), "biweight" (or beta(3,3)), "triweight" (or beta(4,4)), "cos", "inv", "gaussian" and "optimal".

ykernel: Window width of a y-kernel, especially for prediction of ordinal classes.

scale: Logical; scale variables to have equal standard deviation.

contrasts: A vector containing the 'unordered' and 'ordered' contrasts to use.

...: Further arguments passed to or from other methods.

kcv: Number of partitions for k-fold cross-validation.
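For example, the search can be limited to an explicit grid of k values and kernels through ks; a minimal sketch, reusing the glass data from the examples below (the particular grid values are illustrative only):

library(kknn)
data(glass)
glass <- glass[, -1]
## tune only over k = 3, 5, 7, 9 and two kernels, with Manhattan distance
fit <- train.kknn(Type ~ ., glass, ks = c(3, 5, 7, 9),
    kernel = c("rectangular", "triangular"), distance = 1)
fit$best.parameters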
Details

train.kknn performs leave-one-out cross-validation and is computationally very efficient. cv.kknn performs k-fold cross-validation; it is generally slower and does not yet include the comparison of different models.
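A k-fold run therefore looks like the following (a minimal sketch, again on the glass data; passing kernel through ... to the underlying fit is an assumption, and the structure of the result is best inspected with str()):

library(kknn)
data(glass)
glass <- glass[, -1]
## 10-fold cross-validation of one fixed model
cv <- cv.kknn(Type ~ ., glass, kcv = 10, kernel = "triangular")
str(cv)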
Value

train.kknn returns a list object of class train.kknn including the following components.
MISCLASS: Matrix of misclassification errors.

MEAN.ABS: Matrix of mean absolute errors.

MEAN.SQU: Matrix of mean squared errors.

fitted.values: List of predictions for all combinations of kernel and k.

best.parameters: List containing the best parameter values for kernel and k.

response: Type of response variable, one of continuous, nominal or ordinal.

distance: Parameter of Minkowski distance.

call: The matched call.

terms: The 'terms' object used.
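These components can be read directly off the fitted object; a minimal sketch, assuming fit is a train.kknn result such as fit.glass1 from the examples (the [[1]] indexing of fitted.values is illustrative):

fit$best.parameters          ## winning kernel and k
fit$MISCLASS                 ## misclassification error for every kernel/k combination
head(fit$fitted.values[[1]]) ## predictions of one kernel/k combination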
Author(s)

Klaus P. Schliep klaus.schliep@gmail.com
References

Hechenbichler K. and Schliep K.P. (2004) Weighted k-Nearest-Neighbor Techniques and Ordinal Classification, Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich (https://doi.org/10.5282/ubm/epub.1769).

Hechenbichler K. (2005) Ensemble-Techniken und ordinale Klassifikation, PhD thesis.

Samworth, R.J. (2012) Optimal weighted nearest neighbour classifiers. Annals of Statistics, 40, 2733-2763. (Available from http://www.statslab.cam.ac.uk/~rjs57/Research.html)
See Also

kknn and simulation
Examples

library(kknn)
## Not run:
data(miete)
(train.con <- train.kknn(nmqm ~ wfl + bjkat + zh, data = miete,
kmax = 25, kernel = c("rectangular", "triangular", "epanechnikov",
"gaussian", "rank", "optimal")))
plot(train.con)
(train.ord <- train.kknn(wflkat ~ nm + bjkat + zh, miete, kmax = 25,
kernel = c("rectangular", "triangular", "epanechnikov", "gaussian",
"rank", "optimal")))
plot(train.ord)
(train.nom <- train.kknn(zh ~ wfl + bjkat + nmqm, miete, kmax = 25,
kernel = c("rectangular", "triangular", "epanechnikov", "gaussian",
"rank", "optimal")))
plot(train.nom)
## End(Not run)
data(glass)
glass <- glass[,-1]
(fit.glass1 <- train.kknn(Type ~ ., glass, kmax = 15, kernel =
c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 1))
(fit.glass2 <- train.kknn(Type ~ ., glass, kmax = 15, kernel =
c("triangular", "rectangular", "epanechnikov", "optimal"), distance = 2))
plot(fit.glass1)
plot(fit.glass2)
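The fitted objects can also be used to predict new observations via the usual predict generic; a minimal sketch (it assumes the package's predict method for train.kknn objects, which applies the best kernel and k found during training):

pred <- predict(fit.glass1, glass)
table(pred, glass$Type)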