DidacticBoost: A Simple Implementation and Demonstration of Gradient Boosting

A basic, clear implementation of tree-based gradient boosting designed to illustrate the core operation of boosting models. Tuning parameters (such as stochastic subsampling, a learning rate/shrinkage, or regularization) are not implemented; the only adjustable parameter is the number of training rounds. If you are looking for a high-performance boosting implementation with tuning parameters, consider the 'xgboost' package.
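To make that core operation concrete, here is a minimal sketch of a boosting loop of the kind the package demonstrates. It is written against the 'rpart' package rather than DidacticBoost's own internals, and the helper names boost_sketch and predict_sketch are illustrative, not part of the package: repeatedly fit a regression tree to the current residuals and add its predictions to the running model.

# Not the package's source code; an illustration of the idea it demonstrates.
library(rpart)

boost_sketch <- function(formula, data, rounds = 50) {
  # Use explicit predictors on the right-hand side of 'formula',
  # since a working column named .resid is added to 'data' below.
  y <- model.response(model.frame(formula, data))
  pred <- rep(mean(y), nrow(data))   # initial prediction: the mean response
  trees <- vector("list", rounds)
  for (i in seq_len(rounds)) {
    data$.resid <- y - pred          # residuals = negative gradient of squared-error loss
    trees[[i]] <- rpart(update(formula, .resid ~ .), data = data)
    pred <- pred + predict(trees[[i]], data)
  }
  list(init = mean(y), trees = trees)
}

predict_sketch <- function(fit, newdata) {
  # The boosted prediction is the initial value plus the sum of all trees' outputs
  fit$init + rowSums(vapply(fit$trees, predict,
                            numeric(nrow(newdata)), newdata = newdata))
}

fit <- boost_sketch(mpg ~ wt + hp, mtcars, rounds = 25)
head(predict_sketch(fit, mtcars))

With no learning rate or regularization, this mirrors the package's design choice: the number of rounds is the only knob, which keeps the boosting mechanism itself in plain view.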

Getting started

Package details

Author: David Shaub [aut, cre]
Maintainer: David Shaub <davidshaub@gmx.com>
License: GPL-3
Version: 0.1.1
URL: https://github.com/dashaub/DidacticBoost
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:
install.packages("DidacticBoost")
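After installation, a hypothetical first call might look like the following. The function name fit_boosted() and its arguments are assumptions here, not confirmed API, so check help(package = "DidacticBoost") for the actual interface:

library(DidacticBoost)
# Hypothetical interface: the number of rounds is the only tuning parameter
model <- fit_boosted(mpg ~ wt + hp, data = mtcars, rounds = 100)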

