mada: Multi-class AdaBoost

View source: R/mada.R

Multi-class AdaBoost

Description

One-vs-all multi-class AdaBoost

Usage

mada(xtr, ytr, xte=NULL, yte=NULL, mstop=50, nu=0.1, interaction.depth=1)

Arguments

xtr

training data matrix containing the predictor variables in the model.

ytr

training vector of responses; ytr must contain integers from 1 to C for a C-class problem.

xte

test data matrix containing the predictor variables in the model.

yte

test vector of responses; yte must contain integers from 1 to C for a C-class problem.

mstop

number of boosting iterations.

nu

a small number (between 0 and 1) defining the step size or shrinkage parameter.

interaction.depth

passed to gbm to specify the maximum depth of each tree.

Details

For a C-class problem (C > 2), each class is compared in turn against all the other classes with AdaBoost, and C functions are estimated to represent the confidence for each class. The classification rule assigns an observation to the class with the largest estimate.
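
The one-vs-all scheme can be illustrated with a minimal sketch built directly on gbm (the boosting engine mentioned above). This is only an illustration of the idea, not mada's internal code; the helper name ova_adaboost is made up for the example.

library(gbm)

## One AdaBoost model per class: class k versus all the other classes
ova_adaboost <- function(x, y, mstop = 50, nu = 0.1, depth = 1) {
  classes <- sort(unique(y))
  fits <- lapply(classes, function(k) {
    dat <- data.frame(x, yk = as.numeric(y == k))
    gbm(yk ~ ., data = dat, distribution = "adaboost",
        n.trees = mstop, shrinkage = nu, interaction.depth = depth)
  })
  ## Confidence of every class for new observations; assign the largest
  predict_class <- function(newx) {
    conf <- sapply(fits, function(f)
      predict(f, newdata = as.data.frame(newx), n.trees = mstop))
    conf <- matrix(conf, ncol = length(classes))
    classes[max.col(conf)]
  }
  list(fits = fits, predict_class = predict_class)
}

On iris, for instance, ova_adaboost(iris[, -5], as.numeric(iris[, 5]))$predict_class(iris[, -5]) returns labels 1 to 3 via the argmax rule described above.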

Value

A list containing the selected variables xselect and the training and test errors err.tr and err.te.
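
Assuming the components are named exactly as above (xselect, err.tr, err.te) and that the errors are tracked per boosting iteration, a fitted object can be inspected roughly as follows; the object name fit is arbitrary.

library(bst)
data(iris)
fit <- mada(xtr = iris[, -5], ytr = as.numeric(iris[, 5]))
fit$xselect          # variables selected during boosting
tail(fit$err.tr, 1)  # training error at the final iteration (assuming per-iteration tracking)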

Author(s)

Zhu Wang

See Also

cv.mada for choosing the stopping iteration by cross-validation.

Examples

data(iris)
## ytr must be integers 1 to C, so code the species factor as 1, 2, 3
mada(xtr = iris[, -5], ytr = as.numeric(iris[, 5]))
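
A fuller sketch with a held-out test set, assuming err.te is populated whenever xte and yte are supplied as described under Value:

set.seed(1)
idx <- sample(nrow(iris), 100)
fit <- mada(xtr = iris[idx, -5],  ytr = as.numeric(iris[idx, 5]),
            xte = iris[-idx, -5], yte = as.numeric(iris[-idx, 5]),
            mstop = 50, nu = 0.1)
tail(fit$err.te, 1)  # test error at the final iteration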
