fastBoost-package: A fast implementation of AdaBoost.M1 and Real AdaBoost


Description

This package provides a blazingly fast, Rcpp-based implementation of AdaBoost.M1 and Real AdaBoost, making it especially well suited for large datasets. Decision trees are currently the only supported weak classifier, and only binary classification tasks are supported. Once the classifiers have been trained, they can be used to predict on new datasets. In addition to the final error, a staged error is calculated after each additional tree; this can be used to tune the final number of boosting iterations, and a plot of the staged error is generated to help the user decide.
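As a minimal sketch of this workflow (using the adaboost and predict calls shown in the Examples section below; the staged-error accessor at the end is a hypothetical name, since this page does not state how the per-iteration error is exposed on the fitted object):

library(fastBoost)
# two-class toy data: class 0 ~ N(0, 1), class 1 ~ N(1, 1)
train_df <- data.frame(X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
                       Y = factor(c(rep(0, 100), rep(1, 100))))
fit <- adaboost(Y ~ X, train_df, 25)   # 25 boosting iterations
pred <- predict(fit, newdata = train_df)
# hypothetical accessor: the per-iteration (staged) training error, one value
# per tree, could be inspected to pick a smaller number of iterations
# fit$staged_error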

Details

The main entry point is the adaboost function, which fits a boosted ensemble of decision trees from a model formula, a training data frame, and a number of boosting iterations. The fitted object can then be passed to predict to classify new data; see the Examples section below. A Real AdaBoost variant of the training routine is also provided.

Author(s)

person("Sourav", "Chatterjee", , "souravc83@gmail.com", c("aut", "cre"))

Maintainer: Sourav Chatterjee <souravc83@gmail.com>

References

Freund, Y. and Schapire, R. E. (1997). A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 55(1), 119-139.

Friedman, J., Hastie, T. and Tibshirani, R. (2000). Additive logistic regression: a statistical view of boosting. The Annals of Statistics, 28(2), 337-407.

See Also

adaboost

Examples

library(fastBoost)
# simulate a two-class dataset: class 0 ~ N(0, 1), class 1 ~ N(1, 1)
fakedata <- data.frame(X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
                       Y = c(rep(0, 100), rep(1, 100)))
fakedata$Y <- factor(fakedata$Y)
# train AdaBoost.M1 with 10 boosting iterations, then predict on the same data
A <- adaboost(Y ~ X, fakedata, 10)
pred <- predict(A, newdata = fakedata)
