evaluation_metrics: Evaluation metrics for model predictions

View source: R/utils.R

evaluation_metrics    R Documentation

Evaluation metrics for model predictions

Description

Computes a set of performance metrics (e.g., AUC, TSS, CBI) based on observed and predicted values.
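
For orientation, the sketch below shows how a metric such as TSS can be derived from binary observed and predicted values. It uses the generic textbook definition (sensitivity + specificity - 1) purely as an illustration and is not necessarily the exact computation performed internally.

## Generic TSS sketch (assumed textbook definition, not necessarily
## the package's internal implementation)
obs  <- c(1, 1, 0, 0, 1, 0)   # observed presences/absences (0/1)
pred <- c(1, 0, 0, 1, 1, 0)   # binarized predictions (0/1)
tp <- sum(obs == 1 & pred == 1)
fn <- sum(obs == 1 & pred == 0)
tn <- sum(obs == 0 & pred == 0)
fp <- sum(obs == 0 & pred == 1)
tss <- tp / (tp + fn) + tn / (tn + fp) - 1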

Usage

evaluation_metrics(df, na.rm = TRUE, method = "spearman")

Arguments

df

A data.frame with the columns 'observed' (binary, 0/1), 'predicted' (binary, 0/1), and 'probability' (numeric); an example layout is sketched after the argument descriptions.

na.rm

Logical. Whether to remove rows with NA values.

method

Correlation method used for the CBI: "spearman" (the default), "pearson", or "kendall".
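
As an illustration of the expected input, a toy data.frame with the three documented columns could be built as follows (values are made up; the NA in 'probability' is the kind of row that na.rm = TRUE would discard):

## Toy input with the three expected columns (illustrative values only)
df <- data.frame(
  observed    = c(1, 1, 0, 0, 1, 0),
  predicted   = c(1, 0, 0, 1, 1, 0),
  probability = c(0.91, 0.42, 0.18, 0.55, NA, 0.07)
)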

Value

A named list or data.frame containing the computed evaluation metrics (e.g., AUC, TSS, CBI).
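
Examples

A minimal usage sketch. The toy data are illustrative; the contents of the returned object follow the Value section above.

df <- data.frame(
  observed    = c(1, 1, 0, 0, 1, 0, 1, 0),
  predicted   = c(1, 0, 0, 1, 1, 0, 1, 0),
  probability = c(0.91, 0.42, 0.18, 0.55, 0.77, 0.07, 0.64, NA)
)

## Drop the row with a missing probability and use Spearman correlation for CBI
metrics <- evaluation_metrics(df, na.rm = TRUE, method = "spearman")
metrics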

