prediction.metrics: Prediction Metric Calculations


Description

Performance evaluation of all fitted models. This function concisely provides model performance metrics, including confusion-matrix statistics and ROC AUC.

Usage

prediction.metrics(finalModel, method, raw.data, inTrain, outTrain, features,
  bestTune, grp.levs, stability.metric)

Arguments

finalModel

List of fitted models

method

Character vector specifying which models were fit

raw.data

Original dataset prior to any training/testing subsetting

inTrain

List of training indices for each feature selection run

outTrain

List of testing data indices for each feature selection run

features

List of selected features for each model

bestTune

List of parameters that have been optimized for each respective model

grp.levs

Vector of group levels

stability.metric

A character object specifying the stability metric

Value

Returns a data frame containing, for each feature selection run, the evaluated Accuracy, Kappa, ROC.AUC, Sensitivity, Specificity, Positive Predictive Value, and Negative Predictive Value.
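
The sketch below illustrates how the metrics in this data frame can be computed for a single run. It uses synthetic two-group predictions together with caret::confusionMatrix and pROC::roc; the objects truth, score.A, and pred are hypothetical stand-ins, not outputs of prediction.metrics itself.

## Minimal, hypothetical sketch of the reported metrics on synthetic data;
## not a call to prediction.metrics().
library(caret)
library(pROC)

set.seed(123)
grp.levs <- c("A", "B")
## Synthetic truth, class scores, and hard predictions for two groups
truth   <- factor(sample(grp.levs, 50, replace = TRUE), levels = grp.levs)
score.A <- ifelse(truth == "A", rnorm(50, 0.65, 0.15), rnorm(50, 0.35, 0.15))
pred    <- factor(ifelse(score.A > 0.5, "A", "B"), levels = grp.levs)

## Confusion-matrix statistics (Accuracy, Kappa, Sensitivity, Specificity, PPV, NPV)
cm <- confusionMatrix(pred, truth, positive = "A")
## ROC AUC from the class scores; levels = c(control, case)
roc.obj <- roc(response = truth, predictor = score.A, levels = rev(grp.levs))

data.frame(
  Accuracy       = unname(cm$overall["Accuracy"]),
  Kappa          = unname(cm$overall["Kappa"]),
  ROC.AUC        = as.numeric(auc(roc.obj)),
  Sensitivity    = unname(cm$byClass["Sensitivity"]),
  Specificity    = unname(cm$byClass["Specificity"]),
  Pos.Pred.Value = unname(cm$byClass["Pos Pred Value"]),
  Neg.Pred.Value = unname(cm$byClass["Neg Pred Value"])
)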

See Also

performance.stats, perf.calc, and the caret function confusionMatrix

