class_check | R Documentation |
The function computes the confusion matrix between the logical output of an outlier detection algorithm and a reference (ground-truth) logical vector. From the confusion matrix, it also calculates the overall accuracy of the results, along with recall, precision, and F1-scores for the two classes (regular versus outlier).
class_check(pred, truth)
pred
A logical vector with the classification output from an anomaly detection algorithm.
truth
A logical vector with the observed classification used as a reference (or ground truth).
The function computes the confusion matrix using the function table(). True positives and false negatives are then evaluated to compute the overall accuracy, recall, precision, and F1-scores.
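As a minimal sketch (not the package's internal code), the lines below show how such metrics can be derived from a table() confusion matrix with predictions on the rows and the reference on the columns; the vectors are illustrative assumptions:

pred  <- c(TRUE, FALSE, TRUE, FALSE, FALSE)   # hypothetical predictions
truth <- c(TRUE, FALSE, FALSE, FALSE, TRUE)   # hypothetical ground truth
cm <- table(factor(pred, c(FALSE, TRUE)), factor(truth, c(FALSE, TRUE)))
overall <- sum(diag(cm)) / sum(cm)                    # overall accuracy
recall <- diag(cm) / colSums(cm)                      # per-class recall
precision <- diag(cm) / rowSums(cm)                   # per-class precision
f1 <- 2 * recall * precision / (recall + precision)   # per-class F1-scores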
An object of S3 class checkwise
containing the confusion matrix, with the other accuracy metrics appended as attributes.
attr(, "overall")
A numeric value between zero and one with the overall accuracy.
attr(, "recall")
A numeric vector of values between zero and one with the recall index for regular and outlier cells.
attr(, "precision")
A numeric vector of values between zero and one with the precision index for regular and outlier cells.
attr(, "f1-score")
A numeric vector of values between zero and one with the F1-scores for regular and outlier cells.
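The attributes can be retrieved with attr(), as sketched below; chk is a hypothetical name for the result, and pred and truth are logical vectors as in the Usage section:

chk <- class_check(pred, truth)   # 'chk' is an illustrative name for the result
attr(chk, "overall")              # single accuracy value between zero and one
attr(chk, "recall")               # recall for regular and outlier cells
attr(chk, "precision")            # precision for regular and outlier cells
attr(chk, "f1-score")             # F1-scores for regular and outlier cells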
Luca Sartore drwolf85@gmail.com
# Load the package
library(HRTnomaly)
set.seed(2025L)
# Load the 'toy' data
data(toy)
# Detect cellwise outliers using Bayesian Analysis
res <- cellwise(toy[sample.int(100), ], 0.5, 10L)
class_check(res$outlier, res$anomaly_flag != "")