Description

fastAdaboost provides a blazingly fast implementation of the discrete and real AdaBoost algorithms, built on a C++ backend. The goal of the package is to deliver fast performance on large in-memory data sets.
Author(s)

Sourav Chatterjee
References

Freund, Y. and Schapire, R.E. (1996): "Experiments with a new boosting algorithm". In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 148–156. Morgan Kaufmann.

Zhu, J., Zou, H., Rosset, S. and Hastie, T. (2009): "Multi-class AdaBoost". Statistics and Its Interface, 2(3), pp. 349–360.
Examples
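The original example code did not survive extraction (only its line-number gutter remained). The following is a minimal usage sketch assuming the package's adaboost(), real_adaboost(), and predict() interface; the argument name nIter and the fields of the prediction object (class, error) are assumptions to check against the package help pages.

    # NOTE: a minimal sketch, not the package's original example.
    # Assumes adaboost()/real_adaboost() take a formula, a data frame,
    # and a number of boosting iterations (nIter); see ?adaboost.
    library(fastAdaboost)

    set.seed(42)

    # Simulated two-class data: class 1 is shifted by one unit in X
    fakedata <- data.frame(
      X = c(rnorm(100, 0, 1), rnorm(100, 1, 1)),
      Y = factor(c(rep(0, 100), rep(1, 100)))
    )

    # Discrete AdaBoost with 10 boosting rounds
    ada_fit <- adaboost(Y ~ X, data = fakedata, nIter = 10)

    # Real AdaBoost on the same data
    real_fit <- real_adaboost(Y ~ X, data = fakedata, nIter = 10)

    # Predict on (here, the training) data; the returned object is
    # assumed to carry predicted classes and an error rate
    pred <- predict(ada_fit, newdata = fakedata)
    print(pred$class[1:5])
    print(pred$error)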