mlr_learners_classif.kknn: k-Nearest-Neighbor Classification Learner

mlr_learners_classif.kknnR Documentation

k-Nearest-Neighbor Classification Learner

Description

k-Nearest-Neighbor classification. Calls kknn::kknn() from package kknn.

Initial parameter values

  • store_model: see the Note section below.

Dictionary

This Learner can be instantiated via the dictionary mlr_learners or with the associated sugar function lrn():

mlr_learners$get("classif.kknn")
lrn("classif.kknn")

Meta Information

  • Task type: “classif”

  • Predict Types: “response”, “prob”

  • Feature Types: “logical”, “integer”, “numeric”, “factor”, “ordered”

  • Required Packages: mlr3, mlr3learners, kknn

Parameters

Id          Type      Default  Levels                                    Range
k           integer   7        -                                         [1, ∞)
distance    numeric   2        -                                         [0, ∞)
kernel      character optimal  rectangular, triangular, epanechnikov,
                               biweight, triweight, cos, inv, gaussian,
                               rank, optimal                             -
scale       logical   TRUE     TRUE, FALSE                               -
ykernel     untyped   -        -                                         -
store_model logical   FALSE    TRUE, FALSE                               -
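
These hyperparameters can be set when constructing the learner via lrn(), or changed afterwards through its param_set. A minimal sketch, assuming mlr3 and mlr3learners are installed; the specific values below are arbitrary illustrations, not recommendations:

```r
library(mlr3)
library(mlr3learners)

# Set hyperparameters at construction time
learner = lrn("classif.kknn",
  k = 5,               # number of neighbors, range [1, Inf)
  distance = 1,        # Minkowski distance parameter (1 = Manhattan)
  kernel = "gaussian", # weighting kernel for neighbor votes
  scale = TRUE         # standardize features before computing distances
)

# Parameters can also be changed after construction
learner$param_set$values$k = 9
```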

Super classes

mlr3::Learner -> mlr3::LearnerClassif -> LearnerClassifKKNN

Methods

Public methods

Inherited methods

Method new()

Creates a new instance of this R6 class.

Usage
LearnerClassifKKNN$new()

Method clone()

The objects of this class are cloneable with this method.

Usage
LearnerClassifKKNN$clone(deep = FALSE)
Arguments
deep

Whether to make a deep clone.

Note

There is no training step for k-NN models: training merely stores the training data, which is processed during the predict step. Therefore, $model returns a list with the following elements:

  • formula: Formula for calling kknn::kknn() during $predict().

  • data: Training data for calling kknn::kknn() during $predict().

  • pv: Training parameters for calling kknn::kknn() during $predict().

  • kknn: Model as returned by kknn::kknn(), only available after $predict() has been called. This element is not stored by default; set the hyperparameter store_model to TRUE to keep it.
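
A minimal sketch of inspecting this model list, assuming mlr3 and mlr3learners are installed (the "iris" task is used purely for illustration):

```r
library(mlr3)
library(mlr3learners)

# Keep the fitted kknn object after prediction
learner = lrn("classif.kknn", store_model = TRUE)
task = tsk("iris")

learner$train(task)
names(learner$model)  # formula, data, pv (no kknn element yet)

learner$predict(task)
class(learner$model$kknn)  # the kknn::kknn() fit, available after $predict()
```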

References

Hechenbichler, Klaus, Schliep, Klaus (2004). “Weighted k-nearest-neighbor techniques and ordinal classification.” Technical Report Discussion Paper 399, SFB 386, Ludwig-Maximilians University Munich. doi:10.5282/ubm/epub.1769.

Samworth, R J (2012). “Optimal weighted nearest neighbour classifiers.” The Annals of Statistics, 40(5), 2733–2763. doi:10.1214/12-AOS1049.

Cover, Thomas, Hart, Peter (1967). “Nearest neighbor pattern classification.” IEEE Transactions on Information Theory, 13(1), 21–27. doi:10.1109/TIT.1967.1053964.

See Also

Other Learner: mlr_learners_classif.cv_glmnet, mlr_learners_classif.glmnet, mlr_learners_classif.lda, mlr_learners_classif.log_reg, mlr_learners_classif.multinom, mlr_learners_classif.naive_bayes, mlr_learners_classif.nnet, mlr_learners_classif.qda, mlr_learners_classif.ranger, mlr_learners_classif.svm, mlr_learners_classif.xgboost, mlr_learners_regr.cv_glmnet, mlr_learners_regr.glmnet, mlr_learners_regr.kknn, mlr_learners_regr.km, mlr_learners_regr.lm, mlr_learners_regr.nnet, mlr_learners_regr.ranger, mlr_learners_regr.svm, mlr_learners_regr.xgboost

Examples

if (requireNamespace("kknn", quietly = TRUE)) {
  # Define the Learner and set parameter values
  learner = lrn("classif.kknn")
  print(learner)

  # Define a Task
  task = tsk("sonar")

  # Create train and test set
  ids = partition(task)

  # Train the learner on the training ids
  learner$train(task, row_ids = ids$train)

  # Print the model
  print(learner$model)

  # Importance method, if the learner supports it
  if ("importance" %in% learner$properties) print(learner$importance())

  # Make predictions for the test rows
  predictions = learner$predict(task, row_ids = ids$test)

  # Score the predictions
  predictions$score()
}

mlr3learners documentation built on Nov. 21, 2023, 5:07 p.m.