Extensions to Freund and Schapire's AdaBoost algorithm (Y. Freund and R. Schapire, 1997, <doi:10.1006/jcss.1997.1504>) and to Friedman's gradient boosting machine (J.H. Friedman, 2001, <doi:10.1214/aos/1013203451>). Includes regression methods for least squares, absolute loss, t-distribution loss, quantile regression, logistic, Poisson, Cox proportional hazards partial likelihood, AdaBoost exponential loss, Huberized hinge loss, and learning-to-rank measures (LambdaMART).
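The distributions listed above are selected through the model-fitting call. A minimal sketch of fitting a least-squares ("gaussian") boosted model, assuming the classic `gbm()` formula interface carries over to gbm3 (argument names may differ slightly between versions):

```r
# Sketch only: assumes gbm3 retains the classic gbm() formula interface.
library(gbm3)

set.seed(1)
df <- data.frame(x = runif(200))
df$y <- sin(3 * df$x) + rnorm(200, sd = 0.1)

# distribution = "gaussian" selects the least-squares loss from the list above;
# other values (e.g. "laplace", "poisson", "coxph") select the other losses.
fit <- gbm(y ~ x, data = df, distribution = "gaussian",
           n.trees = 500, shrinkage = 0.05)

# Predictions at a chosen number of trees.
pred <- predict(fit, newdata = df, n.trees = 500)
```

Swapping the `distribution` argument is the only change needed to move between the loss functions the package supports.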
Package details

| | |
|---|---|
| Maintainer | |
| License | GPL (>= 2) |
| Version | 3.0 |
| URL | https://github.com/gbm-developers/gbm3 |
| Package repository | View on GitHub |
Installation

Install the latest version of this package by entering the following in R:
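The installation command itself did not survive extraction. A typical way to install a development package from the GitHub repository listed above (assuming the `remotes` helper package is available) is:

```r
# Assumes the remotes package; run install.packages("remotes") first if needed.
remotes::install_github("gbm-developers/gbm3")
```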