View source: R/metric_scores.R
metric_scores    R Documentation
Description

Creates a metric_scores object to facilitate visualization. Check how the metric scores differ among models, what the scores are, and how they change, for example after applying a bias-mitigation technique. The vertical black lines denote the scores for the privileged subgroup. It is best to use only a few metrics (via the fairness_metrics parameter).
Usage

metric_scores(x, fairness_metrics = c("ACC", "TPR", "PPV", "FPR", "STP"))
Arguments

x                   object of class fairness_object

fairness_metrics    character, vector of fairness metric names. Default metrics are the ones shown in the fairness_check plot.
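As a minimal sketch of the fairness_metrics argument (assuming fobject is an existing fairness_object, for example one created by fairness_check() as in the Examples below), the scores can be restricted to a smaller set of metrics:

# assumes `fobject` is a fairness_object created beforehand with fairness_check()
ms_small <- metric_scores(fobject, fairness_metrics = c("TPR", "STP"))
plot(ms_small)  # plot only the selected metrics for easier comparison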
Value

metric_scores object. It is a list containing:

metric_scores_data - data.frame with information about the score in a particular subgroup, metric, and model

privileged - name of the privileged subgroup
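Since the returned value is a list, its components can be inspected directly. A small sketch, assuming ms was produced as in the Examples below:

# assumes `ms` is a metric_scores object, e.g. ms <- metric_scores(fobject)
head(ms$metric_scores_data)   # data.frame: score per subgroup, metric, and model
ms$privileged                 # name of the privileged subgroup, e.g. "male"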
data("german") y_numeric <- as.numeric(german$Risk) - 1 lm_model <- glm(Risk ~ ., data = german, family = binomial(link = "logit") ) explainer_lm <- DALEX::explain(lm_model, data = german[, -1], y = y_numeric) fobject <- fairness_check(explainer_lm, protected = german$Sex, privileged = "male" ) ms <- metric_scores(fobject, fairness_metrics = c("ACC", "TPR", "PPV", "FPR", "STP")) plot(ms) rf_model <- ranger::ranger(Risk ~ ., data = german, probability = TRUE, num.trees = 200 ) explainer_rf <- DALEX::explain(rf_model, data = german[, -1], y = y_numeric) fobject <- fairness_check(explainer_rf, fobject) ms <- metric_scores(fobject, fairness_metrics = c("ACC", "TPR", "PPV", "FPR", "STP")) plot(ms)