View source: R/classification_report.R
classification_report    R Documentation

Description

Calculates the confusion matrix and several performance metrics.
Usage

classification_report(
  y_true,
  y_pred
)
Arguments

y_true    A vector with the true labels.
y_pred    A vector with the predicted labels.
Value

Returns a list with the following entries (a sketch of how such quantities can be computed follows this list):

metrics              A table with the precision, recall and f1-score for each class.
confusion_matrix     The confusion matrix.
accuracy             The accuracy.
mutual_information   The mutual information between the true and the predicted classes.
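The exact computation is defined in R/classification_report.R; the sketch below is only illustrative and assumes a confusion matrix with true classes in rows, predicted classes in columns, and mutual information measured in nats. In base R, these quantities could be derived roughly as follows:

# Illustrative sketch only; the package's implementation may differ.
cm <- table(true = c("a", "a", "b", "b", "b", "c"),
            pred = c("a", "b", "b", "b", "c", "c"))

tp        <- diag(cm)                 # correctly classified counts per class
precision <- tp / colSums(cm)         # TP / predicted positives per class
recall    <- tp / rowSums(cm)         # TP / actual positives per class
f1        <- 2 * precision * recall / (precision + recall)
accuracy  <- sum(tp) / sum(cm)

# Mutual information between true and predicted labels (natural log assumed),
# computed from the joint and marginal label frequencies.
p_joint <- cm / sum(cm)
p_true  <- rowSums(p_joint)
p_pred  <- colSums(p_joint)
mi <- sum(p_joint * log(p_joint / outer(p_true, p_pred)), na.rm = TRUE)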
Examples

# Example 1
X <- iris[, 1:4]
y <- iris$Species
model <- copulaClassifier(X = X, y = y, copula = "frank",
                          distribution = "kernel", graph_model = "tree")
y_pred <- copulaPredict(X = X, model = model)
classification_report(y_true = y, y_pred = y_pred$class)
# Example 2
X <- iris[, 1:4]
y <- iris$Species
model <- copulaClassifier(X = X, y = y, copula = c("frank", "clayton"),
                          distribution = "kernel", graph_model = "chain")
y_pred <- copulaPredict(X = X, model = model)
classification_report(y_true = y, y_pred = y_pred$class)
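Since the return value is a list, its documented entries can be inspected directly. Continuing Example 1 (entry names taken from the Value section above):

report <- classification_report(y_true = y, y_pred = y_pred$class)
report$metrics             # per-class precision, recall and f1-score
report$confusion_matrix    # the confusion matrix
report$accuracy            # overall accuracy
report$mutual_information  # mutual information between true and predicted labels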