evaluate: Evaluating prediction/modeling quality


View source: R/evaluating.r

Description

evaluate is a generic function for evaluating the quality of time series predictions or of modeling fitness, based on a particular metric defined in an evaluating object. The function invokes particular methods which depend on the class of the first argument.
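
A minimal sketch of that dispatch, assuming TSPred is installed and that MSE_eval() (used in the Examples below) builds an error-based evaluating object; the class checks are expectations inferred from the Usage section, not asserted by this page:

library(TSPred)

obj <- MSE_eval()            # an error metric object (see the Examples below)
inherits(obj, "evaluating")  # expected TRUE: obj carries the generic evaluating class
inherits(obj, "error")       # expected TRUE: evaluate(obj, ...) dispatches to evaluate.error()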

Usage

evaluate(obj, ...)

## S3 method for class 'evaluating'
evaluate(obj, test, pred, ...)

## S3 method for class 'fitness'
evaluate(obj, mdl, test = NULL, pred = NULL, ...)

## S3 method for class 'error'
evaluate(obj, mdl = NULL, test = NULL, pred = NULL, ..., fitness = FALSE)

Arguments

obj

An object of class evaluating defining a particular metric.

...

Other parameters passed to the eval_func of obj.

test

A vector or univariate time series containing actual values for a time series that are to be compared against pred.

pred

A vector or univariate time series containing time series predictions that are to be compared against the values in test.

mdl

A time series model object for which fitness is to be evaluated.

fitness

Should the function compute the fitness quality? If TRUE, the function uses mdl to compute the fitness error; otherwise, it uses test and pred to compute the prediction error.

For evaluate.fitness, test and pred are ignored and can be set to NULL. For evaluate.error, mdl is ignored if fitness is FALSE; otherwise, test and pred are ignored and can be set to NULL (as sketched below).
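
The following sketch makes this argument handling explicit, reusing mdl, CATS.cont, and pred as defined in the Examples section at the end of this page:

evaluate(AIC_eval(), mdl)                                      # fitness method: test and pred are ignored
evaluate(MSE_eval(), test = CATS.cont[,1], pred = pred$mean)   # error method with fitness = FALSE: mdl is ignored
evaluate(MSE_eval(), mdl = mdl, fitness = TRUE)                # error method with fitness = TRUE: test and pred are ignored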

Value

A list containing obj and the computed metric values.

Author(s)

Rebecca Pontes Salles

See Also

Other evaluate: evaluate.tspred()

Examples

library(TSPred)
data(CATS, CATS.cont)
mdl <- forecast::auto.arima(CATS[,1])                        # fit an ARIMA model to the first CATS series
pred <- forecast::forecast(mdl, h = length(CATS.cont[,1]))   # forecast over the continuation horizon

evaluate(MSE_eval(), test = CATS.cont[,1], pred = pred$mean) # prediction error (MSE)
evaluate(MSE_eval(), mdl, fitness = TRUE)                    # fitness error of mdl (MSE)
evaluate(AIC_eval(), mdl)                                    # fitness quality of mdl (AIC)
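
As noted in the Value section, evaluate returns a list containing obj and the computed metric values. A short sketch continuing the example above inspects that structure without assuming particular component names:

res <- evaluate(MSE_eval(), test = CATS.cont[,1], pred = pred$mean)
str(res)   # a list holding the evaluating object and the computed MSE value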
