DidacticBoost: A Simple Implementation and Demonstration of Gradient Boosting
Version 0.1.1

A basic, clear implementation of tree-based gradient boosting designed to illustrate the core operation of boosting models. Tuning parameters (such as stochastic subsampling, a modified learning rate, or regularization) are not implemented; the only adjustable parameter is the number of training rounds. If you are looking for a high-performance boosting implementation with tuning parameters, consider the 'xgboost' package.
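A minimal usage sketch of the idea described above. The exact argument names of `fitBoosted` are assumptions here (based on the function names listed on this page); consult the package man pages for the actual signature:

```r
library(DidacticBoost)

# Fit a boosted tree model to a built-in dataset. "rounds" is assumed
# to be the single adjustable parameter mentioned above: the number of
# boosting iterations.
fit <- fitBoosted(mpg ~ ., data = mtcars, rounds = 50)
```

Each round fits a small tree to the residuals of the current ensemble, so more rounds means a closer (and eventually overfit) fit to the training data.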


Author: David Shaub [aut, cre]
Date of publication: 2016-04-19 08:11:59
Maintainer: David Shaub <davidshaub@gmx.com>
License: GPL-3
URL: https://github.com/dashaub/DidacticBoost
Package repository: CRAN
Installation: Install the latest version of this package by entering the following in R:
install.packages("DidacticBoost")

Man pages

fitBoosted: Simple Gradient Boosting
is.boosted: Is the Object a Boosted Model
predict.boosted: Model Predictions
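The three exported functions suggest a simple fit/inspect/predict workflow. A hedged sketch (argument names for `fitBoosted` and `predict` are assumptions; the man pages have the authoritative signatures):

```r
library(DidacticBoost)

# Fit a boosted model (assumed signature: formula, data, rounds).
fit <- fitBoosted(mpg ~ ., data = mtcars, rounds = 100)

# Check the object's class with the package's predicate function.
is.boosted(fit)

# Generate predictions; predict() dispatches to predict.boosted().
preds <- predict(fit, newdata = mtcars)
```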


Files

tests
tests/testthat.R
tests/testthat
tests/testthat/test-Main.R
NAMESPACE
R
R/Main.R
MD5
DESCRIPTION
man
man/fitBoosted.Rd
man/predict.boosted.Rd
man/is.boosted.Rd
DidacticBoost documentation built on May 19, 2017, 10:22 p.m.