adaboost: Adaboost.M1 algorithm


View source: R/adaboost_m1.R

Description

Implements Freund and Schapire's Adaboost.M1 algorithm for binary classification.

Usage

adaboost(formula, data, nIter, ...)

Arguments

formula

Formula describing the model to fit, e.g. Y ~ X.

data

Input data frame containing the response and the predictors.

nIter

Number of weak classifiers (boosting iterations) to fit.

...

Other optional arguments; not currently implemented.

Details

This implements the Adaboost.M1 algorithm for a binary classification task. The target variable must be a factor with exactly two levels. The final classifier is a linear combination of weak decision tree classifiers.
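To make the weighted-combination idea concrete, here is a minimal conceptual sketch of the Adaboost.M1 boosting loop in plain R, using rpart stumps as the weak learners. This is not the package's own implementation; the function name adaboost_m1_sketch, the choice of rpart stumps, and the early-stopping rule are illustrative assumptions.

library(rpart)

adaboost_m1_sketch <- function(formula, data, nIter) {
  # assumes the formula names its predictors explicitly, e.g. Y ~ X
  y <- model.response(model.frame(formula, data))   # two-level factor response
  n <- nrow(data)
  data$.w <- rep(1 / n, n)                          # case weights, initially uniform
  trees <- vector("list", nIter)
  alpha <- numeric(nIter)
  for (t in seq_len(nIter)) {
    # weak learner: a single-split tree fit under the current weights
    fit  <- rpart(formula, data = data, weights = .w, method = "class",
                  control = rpart.control(maxdepth = 1, cp = 0, minsplit = 2))
    pred <- predict(fit, data, type = "class")
    miss <- as.numeric(pred != y)
    err  <- sum(data$.w * miss) / sum(data$.w)      # weighted error of this weak learner
    if (err == 0 || err >= 0.5) break               # no useful weak learner; stop early
    alpha[t]   <- log((1 - err) / err)              # weight of this classifier in the vote
    data$.w    <- data$.w * exp(alpha[t] * miss)    # up-weight misclassified rows
    data$.w    <- data$.w / sum(data$.w)            # renormalize weights
    trees[[t]] <- fit
  }
  list(trees = trees, alpha = alpha)                # final classifier: weighted vote of the trees
}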

Value

An object of class adaboost, which can be passed to predict.adaboost to classify new data.

References

Freund, Y. and Schapire, R.E. (1996): “Experiments with a new boosting algorithm”. In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156, Morgan Kaufmann.

See Also

real_adaboost, predict.adaboost

Examples

library(fastAdaboost)

fakedata <- data.frame(X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
                       Y = c(rep(0, 100), rep(1, 100)))
fakedata$Y <- factor(fakedata$Y)
test_adaboost <- adaboost(Y ~ X, data = fakedata, nIter = 10)
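A natural follow-up is to score the fitted model with predict.adaboost (see See Also). The lines below are a hedged sketch: the newdata argument follows the usual S3 predict convention, and the $error and $class components are assumptions about the returned object; check ?predict.adaboost before relying on them.

pred_adaboost <- predict(test_adaboost, newdata = fakedata)  # S3 predict method from the package
pred_adaboost$error                                          # assumed component: misclassification error
table(pred_adaboost$class, fakedata$Y)                       # assumed component: predicted classes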
