cv.mhingeova: Cross-Validation for one-vs-all HingeBoost with multi-class problem

View source: R/mhingeova.R


Cross-Validation for one-vs-all HingeBoost with multi-class problem

Description

Cross-validated estimation of the empirical misclassification error for boosting parameter selection.

Usage

cv.mhingeova(x, y, balance=FALSE, K=10, cost = NULL, nu=0.1, 
learner=c("tree", "ls", "sm"), maxdepth=1, m1=200, twinboost = FALSE, 
m2=200, trace=FALSE, plot.it = TRUE, se = TRUE, ...)

Arguments

x

a data frame containing the variables in the model.

y

vector of multi-class responses; y must be an integer vector taking values from 1 to C for a C-class problem.

balance

logical value. If TRUE, the K folds are roughly balanced, so that the classes are distributed proportionally among them.

K

number of folds for K-fold cross-validation.

cost

price to pay for a false positive, 0 < cost < 1; the price of a false negative is 1 - cost.

nu

a small number (between 0 and 1) defining the step size or shrinkage parameter.

learner

a character string specifying the component-wise base learner to be used: "ls" (linear models), "sm" (smoothing splines) or "tree" (regression trees).

maxdepth

tree depth used when learner="tree".

m1

number of boosting iterations.

twinboost

logical value; if TRUE, twin boosting is performed.

m2

number of twin boosting iterations.

trace

if TRUE, iteration results are printed out.

plot.it

a logical value; if TRUE, the estimated risks are plotted.

se

a logical value; if TRUE, standard error bars are added to the plot.

...

additional arguments.

Value

An object with the following components:

residmat

empirical risks for each cross-validation fold at each boosting iteration.

fraction

abscissa values at which the CV curve is computed.

cv

the CV curve at each value of fraction.

cv.error

the standard error of the CV curve.

...
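
These components make it easy to read off the iteration with the smallest cross-validated risk; a minimal sketch, assuming cvfit is an object returned by cv.mhingeova:

opt <- which.min(cvfit$cv)    # index of the smallest CV risk
cvfit$fraction[opt]           # corresponding boosting iteration
cvfit$cv.error[opt]           # its standard error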

Note

The functions for balanced cross-validation are from the R package pamr.

See Also

mhingeova
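
Examples

A minimal sketch, not taken from the package examples; the simulated data and parameter choices below are illustrative only:

library(bst)
set.seed(1)
n <- 150
x <- data.frame(x1 = rnorm(n), x2 = rnorm(n))
y <- sample(1:3, n, replace = TRUE)   # integer labels 1..C for a 3-class problem
cvm <- cv.mhingeova(x, y, balance = TRUE, K = 5, nu = 0.1,
                    learner = "tree", maxdepth = 1, m1 = 50,
                    plot.it = TRUE, se = TRUE)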

