adaboost: AdaBoost

View source: R/adaboost.R

adaboost R Documentation

AdaBoost

Description

An implementation of the AdaBoost.MH (Adaptive Boosting) algorithm for classification. This can be used to train an AdaBoost model on labeled data or use an existing AdaBoost model to predict the classes of new points.

Usage

adaboost(
  input_model = NA,
  iterations = NA,
  labels = NA,
  test = NA,
  tolerance = NA,
  training = NA,
  verbose = FALSE,
  weak_learner = NA
)

Arguments

input_model

Input AdaBoost model (AdaBoostModel).

iterations

The maximum number of boosting iterations to be run (0 will run until convergence). Default value "1000" (integer).

labels

Labels for the training set (integer row).

test

Test dataset (numeric matrix).

tolerance

The tolerance for change in values of the weighted error during training. Default value "1e-10" (numeric).

training

Dataset for training AdaBoost (numeric matrix).

verbose

Display informational messages and the full list of parameters and timers at the end of execution. Default value "FALSE" (logical).

weak_learner

The type of weak learner to use: 'decision_stump', or 'perceptron'. Default value "decision_stump" (character).

Details

This program implements the AdaBoost (or Adaptive Boosting) algorithm. The variant of AdaBoost implemented here is AdaBoost.MH. It uses a weak learner, either decision stumps or perceptrons, and over many iterations, creates a strong learner that is a weighted ensemble of weak learners. It runs these iterations until a tolerance value is crossed for change in the value of the weighted training error.

For more information about the algorithm, see the paper "Improved Boosting Algorithms Using Confidence-Rated Predictions", by R.E. Schapire and Y. Singer.

This program allows training of an AdaBoost model, and then application of that model to a test dataset. To train a model, a dataset must be passed with the "training" option. Labels can be given with the "labels" option; if no labels are specified, the labels will be assumed to be the last column of the input dataset. Alternatively, an AdaBoost model may be loaded with the "input_model" option.

Once a model is trained or loaded, it may be used to provide class predictions for a given test dataset. A test dataset may be specified with the "test" parameter. The predicted classes for each point in the test dataset are output to the "predictions" output parameter. The AdaBoost model itself is output to the "output_model" output parameter.

Note: the following parameter is deprecated and will be removed in mlpack 4.0.0: "output". Use "predictions" instead of "output".

Value

A list with several components:

output

Predicted labels for the test set (integer row).

output_model

Output trained AdaBoost model (AdaBoostModel).

predictions

Predicted labels for the test set (integer row).

probabilities

Predicted class probabilities for each point in the test set (numeric matrix).

Author(s)

mlpack developers

Examples

# For example, to run AdaBoost on an input dataset "data" with labels
# "labels" and perceptrons as the weak learner type, storing the trained
# model in "model", one could use the following command: 

## Not run: 
output <- adaboost(training=data, labels=labels, weak_learner="perceptron")
model <- output$output_model

## End(Not run)
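
# To run boosting until convergence rather than for a fixed number of
# iterations, "iterations" can be set to 0; boosting then stops once the
# change in the weighted training error falls below "tolerance" (here
# "data" and "labels" are assumed to be defined as above): 

## Not run: 
output <- adaboost(training=data, labels=labels, iterations=0,
                   tolerance=1e-10)
model <- output$output_model

## End(Not run)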

# Similarly, an already-trained model in "model" can be used to provide class
# predictions from test data "test_data" and store the output in
# "predictions" with the following command: 

## Not run: 
output <- adaboost(input_model=model, test=test_data)
predictions <- output$predictions

## End(Not run)
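
# The "probabilities" output holds predicted class probabilities for each
# test point, alongside the "predictions" output. For example, to retrieve
# both (assuming "model" and "test_data" are defined as above): 

## Not run: 
output <- adaboost(input_model=model, test=test_data)
predictions <- output$predictions
probabilities <- output$probabilities

## End(Not run)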

mlpack documentation built on Oct. 29, 2022, 1:06 a.m.
