Robust Boosting for Robust Loss Functions
Description
MM (majorization-minimization) algorithm based gradient boosting for optimizing nonconvex robust loss functions, with componentwise linear models, smoothing splines, or trees as base learners.
Usage
rbst(x, y, cost = 0.5, rfamily = c("tgaussian", "thuber", "thinge", "tbinom", "binomd",
     "texpo", "tpoisson", "clossR", "closs", "gloss", "qloss"), ctrl = bst_control(),
     control.tree = list(maxdepth = 1), learner = c("ls", "sm", "tree"), del = 1e-10)

Arguments
x 
a data frame containing the variables in the model. 
y 
vector of responses. 
cost 
price to pay for a false positive, 0 < cost < 1. 
rfamily 
robust loss function; see Details for the available choices. 
ctrl 
an object of class bst_control. 
control.tree 
control parameters of rpart. 
learner 
a character specifying the componentwise base learner to be used: "ls" linear models, "sm" smoothing splines, "tree" regression trees.

del 
convergence criterion. 
Details
An MM algorithm operates by creating a convex surrogate function that majorizes the nonconvex objective function. When the surrogate function is minimized with a gradient boosting algorithm, the desired objective function is decreased. The MM algorithm includes the difference of convex (DC) algorithm for rfamily=c("tgaussian", "thuber", "thinge", "tbinom", "binomd", "texpo", "tpoisson") and the quadratic majorization boosting algorithm (QMBA) for rfamily=c("clossR", "closs", "gloss", "qloss").
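As an illustrative sketch of the DC step (not the package's internal code; the function names and the truncation point s below are assumptions for the example), a truncated squared loss can be written as a difference of two convex functions, and one MM iteration minimizes the convex surrogate obtained by linearizing the concave part at the current residual:

```r
## Truncated squared loss: nonconvex, bounded by s^2
tloss <- function(r, s = 1) pmin(r^2, s^2)
## DC decomposition: tloss(r) = r^2 - pmax(r^2 - s^2, 0)
## Subgradient of the (convex) subtracted part at r0:
concave_grad <- function(r0, s = 1) ifelse(r0^2 > s^2, 2 * r0, 0)
## Convex surrogate at r0: linearize the subtracted part
surrogate <- function(r, r0, s = 1)
  r^2 - concave_grad(r0, s) * (r - r0) - pmax(r0^2 - s^2, 0)

r0 <- 3
rs <- seq(-4, 4, by = 0.1)
## Majorization: surrogate lies above the loss everywhere ...
all(surrogate(rs, r0) >= tloss(rs) - 1e-8)   # TRUE
## ... and touches it at the current point r0
abs(surrogate(r0, r0) - tloss(r0)) < 1e-8    # TRUE
```

Decreasing the surrogate (which gradient boosting can do, since it is convex) therefore decreases the truncated loss, which is the mechanism the Details paragraph describes.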
Value
An object of class bst with print, coef, plot and predict methods available for linear models. For nonlinear models, the print and predict methods are available.
x, y, cost, rfamily, learner, control.tree, maxdepth 
These are input variables and parameters. 
ctrl 
the input ctrl object. 
yhat 
predicted function estimates. 
ens 
a list of length mstop; each element is the base learner fitted at the corresponding boosting iteration. 
ml.fit 
the last element of ens. 
ensemble 
a vector of length mstop giving the variable selected at each boosting iteration, when applicable. 
xselect 
variables selected in the mstop boosting iterations. 
coef 
estimated coefficients in the mstop boosting iterations. 
Author(s)
Zhu Wang
See Also
cv.bst
for cross-validated stopping iteration. Furthermore see
bst_control.
Examples
x <- matrix(rnorm(100*5), ncol = 5)
c <- 2*x[,1]
p <- exp(c)/(exp(c) + exp(-c))
y <- rbinom(100, 1, p)
y[y != 1] <- -1
y[1:10] <- -y[1:10]  # contaminate the first 10 labels to mimic outliers
x <- as.data.frame(x)
dat.m <- bst(x, y, ctrl = bst_control(mstop = 50), family = "hinge", learner = "ls")
predict(dat.m)
dat.m1 <- bst(x, y, ctrl = bst_control(twinboost = TRUE,
coefir = coef(dat.m), xselect.init = dat.m$xselect, mstop = 50))
dat.m2 <- rbst(x, y, ctrl = bst_control(mstop = 50, s = 0, trace = TRUE),
rfamily = "thinge", learner = "ls")
predict(dat.m2)
