mhingebst: Boosting for Multi-Classification
Gradient boosting for optimizing multi-class hinge loss functions with componentwise linear least squares, smoothing splines and trees as base learners.
mhingebst(x, y, cost = NULL, family = c("hinge"), ctrl = bst_control(),
          control.tree = list(fixed.depth = TRUE, n.term.node = 6, maxdepth = 1),
          learner = c("ls", "sm", "tree"))

## S3 method for class 'mhingebst'
print(x, ...)

## S3 method for class 'mhingebst'
predict(object, newdata = NULL, newy = NULL, mstop = NULL,
        type = c("response", "class", "loss", "error"), ...)

## S3 method for class 'mhingebst'
fpartial(object, mstop = NULL, newdata = NULL)
x: a data frame containing the variables in the model.

y: vector of responses.

cost: currently only equal misclassification costs are supported; unequal costs may be implemented in the future.

family: family = "hinge" for the multi-class hinge loss, the only option at present.

ctrl: an object of class bst_control.

control.tree: control parameters of rpart.

learner: a character specifying the component-wise base learner to be used: "ls" (linear least squares), "sm" (smoothing splines) or "tree" (regression trees).

type: in predict, the type of prediction: fitted "response", predicted "class" labels, the "loss" value, or the misclassification "error".

object: an object of class mhingebst.

newdata: new data for prediction with the same number of columns as x.

newy: new response.

mstop: boosting iteration for prediction.

...: additional arguments.
A linear or nonlinear classifier is fitted using a boosting algorithm based on component-wise base learners for multi-class responses.
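For instance, a nonlinear classifier can be obtained by choosing trees as the base learner. The following sketch uses the ex1data simulation helper from the example below and the default control.tree settings shown in the usage section; the mstop value is an illustrative assumption, not a recommended default:

```r
library(bst)

set.seed(123)
dat <- ex1data(100, p = 5)

## nonlinear fit: boosted stumps (maxdepth = 1) as component-wise base learners
fit.tree <- mhingebst(x = dat$x, y = dat$y, learner = "tree",
                      ctrl = bst_control(mstop = 50),
                      control.tree = list(fixed.depth = TRUE,
                                          n.term.node = 6, maxdepth = 1))
print(fit.tree)
```

With learner = "ls" the same call instead fits an additive linear classifier.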
An object of class mhingebst, for which print and predict methods are available.
Zhu Wang
Zhu Wang (2011), HingeBoost: ROC-Based Boost for Classification and Variable Selection. The International Journal of Biostatistics, 7(1), Article 13.
Zhu Wang (2012), Multi-class HingeBoost: Method and Application to the Classification of Cancer Types Using Gene Expression Data. Methods of Information in Medicine, 51(2), 162–7.
See cv.mhingebst for the cross-validated stopping iteration. Furthermore see bst_control for control parameters.
## Not run:
dat <- ex1data(100, p = 5)
res <- mhingebst(x = dat$x, y = dat$y)
## End(Not run)
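A slightly fuller sketch of the same workflow, showing prediction on held-out data. It assumes the ex1data helper used in the example above; the mstop value and seed are illustrative assumptions:

```r
library(bst)

## simulate training and test data with ex1data (as in the example above)
set.seed(123)
dat.train <- ex1data(100, p = 5)
dat.test  <- ex1data(100, p = 5)

## fit with component-wise linear least squares base learners
fit <- mhingebst(x = dat.train$x, y = dat.train$y, learner = "ls",
                 ctrl = bst_control(mstop = 50))

## predicted class labels on new data
pred <- predict(fit, newdata = dat.test$x, type = "class")

## misclassification error on new data (newy required for type = "error")
err <- predict(fit, newdata = dat.test$x, newy = dat.test$y, type = "error")
```

The loss path over boosting iterations can be inspected analogously with type = "loss".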