View source: R/model_metrics.R
model_metrics: R Documentation

Description

This function returns a confusion matrix and accuracy for classification models and, for binary classification models, also AUC, Precision, Sensitivity, and Specificity, given the expected values (tags) and predicted values (scores).
Usage

model_metrics(
  tag,
  score,
  multis = NA,
  abc = TRUE,
  thresh = 10,
  auto_n = TRUE,
  thresh_cm = 0.5,
  target = "auto",
  type = "test",
  model_name = NA,
  plots = TRUE,
  quiet = FALSE,
  subtitle = NA
)
Arguments

tag: Vector. Real known label.

score: Vector. Predicted value or model's result.

multis: Data.frame. Contains a column with each category's score (only used when more than 2 categories coexist).

abc: Boolean. Arrange columns and rows alphabetically when values are categorical?

thresh: Integer. Threshold for selecting binary or regression models: the number of unique values in score above which the model is treated as a regression.

auto_n: Add …

thresh_cm: Numeric. Cut-off value used to split the scores for the confusion matrix. Range of values: (0, 1).

target: Value. Which is your target positive value? If set to "auto", …

type: Character. One of: "train", "test".

model_name: Character. Model's name, for reference.

plots: Boolean. Create plot objects?

quiet: Boolean. Suppress all messages, warnings, and recommendations?

subtitle: Character. Subtitle for plots.
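The thresh rule described above (choosing between classification and regression by counting unique values) can be sketched as a standalone helper. This is an illustrative assumption, not the package's actual implementation, and the function name guess_model_type is made up:

```r
# Hypothetical helper (not lares' internal code): mimic how a `thresh`
# cut-off on the number of unique values could separate classification
# from regression models.
guess_model_type <- function(score, thresh = 10) {
  if (length(unique(score)) <= thresh) "classification" else "regression"
}

guess_model_type(rep(c(0, 1), 50))            # few unique values
guess_model_type(seq(0, 1, length.out = 100)) # many unique values
```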
Value

List. Multiple performance metrics that vary depending on the type of model (classification or regression). If plots = TRUE, multiple plot objects are also returned.
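As a concrete illustration of how a thresh_cm cut-off splits binary scores for the confusion matrix and accuracy, here is a minimal base-R sketch, independent of lares, using made-up data:

```r
tag   <- c(1, 0, 1, 1, 0, 0)              # real known labels
score <- c(0.9, 0.2, 0.4, 0.8, 0.6, 0.1)  # model scores
pred  <- as.integer(score >= 0.5)         # split at thresh_cm = 0.5
cm <- table(Real = tag, Predicted = pred) # confusion matrix
accuracy <- sum(diag(cm)) / sum(cm)       # 4 of 6 predictions are correct
```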
See Also

Other Machine Learning: ROC(), conf_mat(), export_results(), gain_lift(), h2o_automl(), h2o_predict_MOJO(), h2o_selectmodel(), impute(), iter_seeds(), lasso_vars(), model_preprocess(), msplit()

Other Model metrics: ROC(), conf_mat(), errors(), gain_lift(), loglossBinary()

Other Calculus: corr(), dist2d(), quants()
Examples

data(dfr) # Results for AutoML Predictions
lapply(dfr, head)

# Metrics for Binomial Model
met1 <- model_metrics(dfr$class2$tag, dfr$class2$scores,
  model_name = "Titanic Survived Model",
  plots = FALSE
)
print(met1)

# Metrics for Multi-Categorical Model
met2 <- model_metrics(dfr$class3$tag, dfr$class3$score,
  multis = subset(dfr$class3, select = -c(tag, score)),
  model_name = "Titanic Class Model",
  plots = FALSE
)
print(met2)

# Metrics for Regression Model
met3 <- model_metrics(dfr$regr$tag, dfr$regr$score,
  model_name = "Titanic Fare Model",
  plots = FALSE
)
print(met3)
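For binary models, the AUC this function reports can also be computed directly from score ranks via the Mann-Whitney identity. This sketch is independent of lares, and the helper name auc_rank is made up:

```r
# Mann-Whitney identity: AUC equals the probability that a randomly
# chosen positive case scores higher than a randomly chosen negative case.
auc_rank <- function(tag, score) {
  r <- rank(score)  # average ranks handle tied scores
  n_pos <- sum(tag == 1)
  n_neg <- sum(tag == 0)
  (sum(r[tag == 1]) - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)
}

auc_rank(c(0, 0, 1, 1), c(0.1, 0.2, 0.8, 0.9)) # perfect separation -> 1
```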