estimateMetrics: Estimate performance metrics in SummarizedBenchmark object


Description

These functions estimate performance metrics for each method in a SummarizedBenchmark object. A metric can either be passed directly through the evalFunction argument or added beforehand with the addPerformanceMetric function.

Usage

estimateMetricsForAssay(object, assay, evalMetric = NULL,
  addColData = FALSE, evalFunction = NULL, tidy = FALSE, ...)

estimatePerformanceMetrics(object, addColData = FALSE, tidy = FALSE,
  rerun = TRUE, ...)

Arguments

object

A SummarizedBenchmark object.

assay

A string with an assay name, indicating which assay should be given as input to the performance metric.

evalMetric

A string with the name of the evaluation metric.

addColData

Logical (default: FALSE). If TRUE, the results are added to the colData slot of the SummarizedBenchmark object and the object is returned. If FALSE, only a DataFrame with the results is returned.

evalFunction

A function that calculates a performance metric. It must accept at least two arguments, query and truth, where query is the output vector of a method and truth is the vector of ground-truth values. Any additional parameters must have default values. If this argument is passed, the metrics stored in the object are ignored and only this evaluation metric is estimated. See the sketch after this argument list.

tidy

Logical (default: FALSE). If TRUE, a long-format data.frame is returned.

...

Additional parameters passed to the performance functions.

rerun

Logical (default: TRUE). By default, all performance metrics are recalculated every time estimatePerformanceMetrics is called. If FALSE, performance metrics are only calculated for newly added or modified methods.
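
A minimal sketch of a valid evalFunction, as referenced in the evalFunction entry above. It assumes q-values as the query vector and 0/1 ground-truth labels; the name fdrFunction is hypothetical:

fdrFunction <- function( query, truth, alpha=0.1 ){
    ## query: a method's output vector (here, q-values)
    ## truth: the vector of ground-truth labels
    ## alpha: an extra argument, so it carries a default value
    called <- query < alpha
    if( sum( called ) == 0 ) return( NA )
    sum( called & truth == 0 ) / sum( called )
}

## Passing evalFunction directly overrides the metrics stored in the
## object; extra arguments such as alpha travel through '...':
## estimateMetricsForAssay( sb, assay="qvalue", evalMetric="FDR",
##                          evalFunction=fdrFunction, alpha=0.05 )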

Value

Either a SummarizedBenchmark object (if addColData = TRUE), a DataFrame (the default), or a long-format data.frame (if tidy = TRUE).
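
A short sketch of how the return type follows from addColData and tidy, assuming a metric has been added to sb as in the Examples below:

estimatePerformanceMetrics( sb )                   ## DataFrame of estimates
estimatePerformanceMetrics( sb, tidy=TRUE )        ## long-format data.frame
estimatePerformanceMetrics( sb, addColData=TRUE )  ## SummarizedBenchmark object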

Functions

estimateMetricsForAssay: estimates performance metrics for a single assay.

estimatePerformanceMetrics: estimates all performance metrics available in the object.

Author(s)

Alejandro Reyes

See Also

availableMetrics, performanceMetrics

Examples

## Load the example SummarizedBenchmark object
data( sb )

## Add a true positive rate (TPR) metric for the "qvalue" assay
sb <- addPerformanceMetric(
   object=sb,
   assay="qvalue",
   evalMetric="TPR",
   evalFunction = function( query, truth, alpha=0.1 ){
       goodHits <- sum( (query < alpha) & truth == 1 )
       goodHits / sum(truth == 1)
   }
)

## Estimate metrics for a single assay, or for all assays at once
qvalueMetrics <- estimateMetricsForAssay( sb, assay="qvalue" )
allMetrics <- estimatePerformanceMetrics( sb )
allMetricsTidy <- estimatePerformanceMetrics( sb, tidy=TRUE )
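
As a hedged continuation of the example, addColData=TRUE stores the estimates in the object's colData instead of returning a table:

sbWithMetrics <- estimatePerformanceMetrics( sb, addColData=TRUE )
colData( sbWithMetrics )  ## metric estimates appear as added columns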
