cv.ccsvm: Cross-validation for ccsvm

View source: R/cv.ccsvm.R

cv.ccsvm (R Documentation)

Cross-validation for ccsvm

Description

Does k-fold cross-validation for ccsvm.

Usage

## S3 method for class 'formula'
cv.ccsvm(formula, data, weights, contrasts=NULL, ...)
## S3 method for class 'matrix'
cv.ccsvm(x, y, weights, ...)
## Default S3 method:
cv.ccsvm(x,  ...)

Arguments

formula

symbolic description of the model to be fitted; see Details.

data

a data frame containing the variables in the model; controls formula processing via model.frame.

x

x matrix as in ccsvm.

y

response y as in ccsvm.

weights

observation weights; defaults to 1 for each observation.

contrasts

the contrasts corresponding to the factor levels in the respective models.

...

Other arguments that can be passed to ccsvm.

Details

Performs K-fold cross-validation to select optimal tuning parameters for the SVM: cost, and also gamma if the kernel is nonlinear. It can additionally select the parameter s used in cfun.
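With a nonlinear kernel, cross-validation searches over gamma as well as cost. A minimal sketch of this, assuming the mpath package is installed and mirroring the linear example in the Examples section below (the choice of kernel="radial" and the data setup are illustrative, not prescribed by this page):

```r
library(mpath)                    # provides ccsvm and cv.ccsvm

set.seed(1)                       # make the simulated data reproducible
x <- matrix(rnorm(40*2), ncol=2)  # 40 observations, 2 predictors
y <- c(rep(-1, 20), rep(1, 20))
x[y==1, ] <- x[y==1, ] + 1        # shift one class so the groups differ

## with kernel="radial", gamma is tuned in addition to cost and s
fit <- cv.ccsvm(x, y, type="C-classification", s=1,
                kernel="radial", cfun="acave")
fit$cost    # selected cost
fit$gamma   # selected gamma (tuned only for nonlinear kernels)
fit$s       # selected s for cfun
```

Further arguments in ... are passed on to ccsvm, which is where the candidate values searched during cross-validation can be controlled.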

Value

An object containing a list of cross-validation results, including the optimal tuning parameters.

residmat

a matrix of cross-validation results. For kernel="linear", each row contains s, cost, error and k, where k is the index of the cross-validation fold. For nonlinear kernels, each row contains s, gamma, cost, error and k.

cost

value of cost that gives the minimum cross-validated error in ccsvm.

gamma

value of gamma that gives the minimum cross-validated error in ccsvm.

s

value of s for cfun that gives the minimum cross-validated error in ccsvm.

Author(s)

Zhu Wang <wangz1@uthscsa.edu>

References

Zhu Wang (2020) Unified Robust Estimation, arXiv e-prints, https://arxiv.org/abs/2010.02848

See Also

ccsvm

Examples

## Not run: 
set.seed(1)                       # for reproducible simulated data
x <- matrix(rnorm(40*2), ncol=2)  # 40 observations, 2 predictors
y <- c(rep(-1, 20), rep(1, 20))
x[y==1, ] <- x[y==1, ] + 1        # shift one class so the groups differ
ccsvm.opt <- cv.ccsvm(x, y, type="C-classification", s=1, kernel="linear", cfun="acave")
ccsvm.opt$cost
ccsvm.opt$gamma
ccsvm.opt$s

## End(Not run)

zhuwang46/mpath documentation built on March 21, 2022, 4:27 a.m.