Description
This function provides a set of descriptive statistics for each evaluation metric that is estimated on a performance estimation comparison. These statistics are obtained for a particular workflow, and for one of the prediction tasks involved in the experiment.
Usage

estimationSummary(results, workflow, task)
Arguments

results: An object with the results of a performance estimation experiment, as obtained with a call to performanceEstimation().

workflow: A string with the ID of a workflow (it can also be an integer).

task: A string with the ID of a task (it can also be an integer).
Value

The function returns a matrix whose rows are summary statistics of the scores obtained on the different iterations, and whose columns are the evaluation metrics estimated in the experiment.
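To make the shape of this matrix concrete, here is a sketch that reuses the experiment from the Examples section below and inspects the returned object (it assumes the performanceEstimation and e1071 packages are installed; the exact row names depend on the package version):

```r
library(performanceEstimation)
library(e1071)
data(swiss)

## the same estimation experiment as in the Examples section
res <- performanceEstimation(
  PredTask(Infant.Mortality ~ ., swiss),
  workflowVariants(learner = "svm",
                   learner.pars = list(cost = c(1, 10), gamma = c(0.01, 0.5))),
  EstimationTask("mse", method = CV(nReps = 2, nFolds = 5))
)

s <- estimationSummary(res, "svm.v2", "swiss.Infant.Mortality")
dim(s)       ## rows: summary statistics; columns: the estimated metric ("mse")
rownames(s)  ## the names of the summary statistics computed by the package
s[, "mse"]   ## the full summary of the MSE scores for workflow svm.v2
```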
Author(s)

Luis Torgo ltorgo@dcc.fc.up.pt
References

Torgo, L. (2014) An Infra-Structure for Performance Estimation and Experimental Comparison of Predictive Models in R. arXiv:1412.0436 [cs.MS] http://arxiv.org/abs/1412.0436
See Also

getScores, performanceEstimation
Examples

## Not run:
## Estimating MSE for four variants of an SVM on the swiss data set,
## using 2 repetitions of 5-fold cross-validation
library(performanceEstimation)
library(e1071)
data(swiss)
## running the estimation experiment
res <- performanceEstimation(
  PredTask(Infant.Mortality ~ ., swiss),
  workflowVariants(learner = "svm",
                   learner.pars = list(cost = c(1, 10), gamma = c(0.01, 0.5))),
  EstimationTask("mse", method = CV(nReps = 2, nFolds = 5))
)
## Get the summary of the estimations of svm.v2 on swiss
estimationSummary(res,"svm.v2","swiss.Infant.Mortality")
## End(Not run)