cv.mhingebst: Cross-Validation for Multi-class Hinge Boosting

Description

Cross-validated estimation of the empirical multi-class hinge loss for boosting parameter selection.

Usage

cv.mhingebst(x, y, balance = FALSE, K = 10, cost = NULL, family = "hinge",
             learner = c("tree", "ls", "sm"), ctrl = bst_control(),
             type = c("loss", "error"), plot.it = TRUE, main = NULL,
             se = TRUE, n.cores = 2, ...)

Arguments

x

a data frame containing the variables in the model.

y

vector of responses. y must take integer values from 1 to C for a C-class problem.

balance

logical value. If TRUE, the K folds are roughly balanced, so that the classes are distributed proportionally among the folds.

K

number of folds for K-fold cross-validation.

cost

cost of a false positive, with 0 < cost < 1; the cost of a false negative is 1 - cost.

family

family = "hinge" for hinge loss.

Implementing the negative gradient corresponding to the loss function to be minimized.

learner

a character string specifying the component-wise base learner to be used: "ls" linear models, "sm" smoothing splines, "tree" regression trees.

ctrl

an object of class bst_control.

type

for family="hinge", type="loss" is hinge risk.

plot.it

a logical value; if TRUE, plot the cross-validated loss or error.

main

title of the plot.

se

a logical value; if TRUE, include standard error bars in the plot.

n.cores

the number of CPU cores to use. The cross-validation loop attempts to dispatch different CV folds to different cores.

...

additional arguments.

Value

an object with the following components:

residmat

empirical risks for each cross-validation fold at each boosting iteration.

fraction

abscissa values at which the CV curve is computed.

cv

the CV curve at each value of fraction.

cv.error

the standard error of the CV curve.

...

See Also

mhingebst
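
Examples

The following is a minimal sketch of a typical call, not taken from the package itself: the simulated three-class data, the random seed, and the control settings are illustrative assumptions.

## simulate a small three-class problem (illustrative only)
library(bst)
set.seed(123)
x <- data.frame(x1 = rnorm(100), x2 = rnorm(100))
y <- sample(1:3, 100, replace = TRUE)  # labels must be integers 1..C

## 5-fold cross-validation of the multi-class hinge risk
cvfit <- cv.mhingebst(x, y, K = 5, family = "hinge", learner = "tree",
                      ctrl = bst_control(mstop = 50), type = "loss",
                      plot.it = FALSE, n.cores = 1)
which.min(cvfit$cv)  # boosting iteration with the smallest CV risk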

