adaBoost: Adaboost.M1 algorithm


Description

Implements Freund and Schapire's Adaboost.M1 algorithm

Usage

adaboost(formula, data, nIter, ...)

Arguments

formula

Formula for building the models

data

Input data frame

nIter

Number of weak classifiers (boosting iterations)

...

Other optional arguments, not currently implemented

Details

This implements the Adaboost.M1 algorithm for a binary classification task. The target variable must be a factor with exactly two levels. The final classifier is a linear combination of weak decision tree classifiers.
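To illustrate the idea, the boosting loop described above can be sketched in plain R. This is a hedged, illustrative sketch only, not the package's internal implementation: the function name `adaboost_sketch` is hypothetical, and it assumes `rpart` decision stumps as the weak learners (the package may grow deeper trees).

```r
library(rpart)  # recommended package shipped with R; provides decision trees

# Illustrative Adaboost.M1 loop (not the package's actual internals).
adaboost_sketch <- function(formula, data, nIter) {
  n <- nrow(data)
  w <- rep(1 / n, n)                      # uniform initial observation weights
  y <- model.response(model.frame(formula, data))
  trees <- list()
  alphas <- numeric(0)
  for (m in seq_len(nIter)) {
    data$.w <- w                          # store weights in the data so rpart can find them
    fit <- rpart(formula, data = data, weights = .w, method = "class",
                 control = rpart.control(maxdepth = 1))  # a stump as weak learner
    pred <- predict(fit, data, type = "class")
    miss <- pred != y
    err <- sum(w * miss)                  # weighted training error (weights sum to 1)
    if (err <= 0 || err >= 0.5) break     # stop if the stump is perfect or no better than chance
    alpha <- log((1 - err) / err)         # weight of this classifier in the final vote
    w <- w * exp(alpha * miss)            # upweight misclassified observations
    w <- w / sum(w)
    trees[[length(trees) + 1]] <- fit
    alphas <- c(alphas, alpha)
  }
  list(trees = trees, alphas = alphas)    # final classifier: alpha-weighted vote of the stumps
}
```

The returned `alphas` are the coefficients of the linear combination mentioned above: classifiers with lower weighted error receive a larger vote.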

References

Freund, Y. and Schapire, R.E. (1996): "Experiments with a new boosting algorithm". In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148-156, Morgan Kaufmann.

See Also

real_adaboost, predict.adaboost

Examples

# Simulate two-class data and fit an Adaboost.M1 model with 10 trees
fakedata <- data.frame(X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
                       Y = c(rep(0, 100), rep(1, 100)))
fakedata$Y <- factor(fakedata$Y)  # target must be a two-level factor
test_adaboost <- adaboost(Y ~ X, data = fakedata, nIter = 10)

souravc83/fastBoost documentation built on May 30, 2019, 6:34 a.m.