confusion: Confusion calculation

Description

First, the test labels are matched to the gold standard labels using the minWeightBipartiteMatching function from this package. Once the labels of labs and labs.known are in the same order, the confusion between them is calculated for each cluster group. The confusion metric was introduced by the package author and is defined in the Details section.
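
As an illustration of the matching step, here is a small sketch of the relabeling idea, assuming the matching has already produced a map from test cluster labels to gold standard labels. The mapping below is invented for the sketch; in the package it comes from minWeightBipartiteMatching.

## Hypothetical test and gold standard label vectors
labs <- c(1, 1, 2, 2, 3, 3)
labs.known <- c(2, 2, 2, 1, 1, 1)

## Suppose the bipartite matching mapped test cluster 1 -> 2, 2 -> 1 and 3 -> 3
## (this mapping is made up for illustration)
matching <- c("1" = 2, "2" = 1, "3" = 3)
labs.matched <- unname(matching[as.character(labs)])
labs.matched  # 2 2 1 1 3 3, relabelled to the gold standard labelling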

Usage

confusion(labs, labs.known)

Arguments

labs

Cluster labels from test clustering

labs.known

Cluster labels from gold standard clustering

Details

For each cluster group, the confusion is defined as

1 - (number of matching labels) / (total number of labels)

where the number of matching labels is the number of labels in a given cluster group of labs that match the labels of labs.known for that cluster group, and the total number of labels is the total number of labels of labs.known in that cluster group.
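
As a sketch under these definitions (an illustrative base R reimplementation under the stated assumptions, not the package's actual code), the per-cluster confusion could be computed as follows:

## labs and labs.known are assumed to already be in matched label order
labs <- c(1, 1, 2, 2, 2, 3, 3, 3)
labs.known <- c(1, 1, 1, 2, 2, 3, 3, 3)

sapply(sort(unique(labs.known)), function(g) {
  in.group <- labs.known == g          # positions belonging to gold standard cluster g
  n.match <- sum(labs[in.group] == g)  # labels of labs that agree with cluster g
  1 - n.match / sum(in.group)          # confusion for this cluster group
})
## approximately 0.333 0.000 0.000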

Value

A vector containing the confusion between labs and labs.known for each cluster group.

Examples

## labs and labs.known are cluster label vectors from the test and gold standard clusterings
res <- confusion(labs, labs.known)
