tuneParams {mlr} | R Documentation

Hyperparameter tuning

Description
Optimizes the hyperparameters of a learner. Allows for different optimization methods, such as grid search, evolutionary strategies, and iterated F-racing. You can select such an algorithm (and its settings) by passing a corresponding control object. For a complete list of implemented algorithms, look at TuneControl.
Multi-criteria tuning can be done with tuneParamsMultiCrit.
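For instance, switching the control object changes the search strategy while the rest of the call stays the same. A minimal sketch, assuming the mlr package is loaded and kernlab is installed for "classif.ksvm" (the parameter bounds and budget below are chosen purely for illustration):

library(mlr)
# random search over C: only the control object differs from a grid search
ps = makeParamSet(
  makeNumericParam("C", lower = -5, upper = 5, trafo = function(x) 2^x)
)
ctrl = makeTuneControlRandom(maxit = 10L)
rdesc = makeResampleDesc("CV", iters = 3L)
res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl)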
Usage

tuneParams(
  learner,
  task,
  resampling,
  measures,
  par.set,
  control,
  show.info = getMlrOption("show.info"),
  resample.fun = resample
)
Arguments

learner
(Learner | character(1)) The learner. If you pass a string, the learner is created via makeLearner.

task
(Task) The task.

resampling
(ResampleInstance | ResampleDesc) Resampling strategy to evaluate points in hyperparameter space. If you pass a description, it is instantiated once at the beginning by default, so all points are evaluated on the same training/test sets. If you want to change that behavior, look at TuneControl.

measures
(list of Measure | Measure) Performance measures to evaluate. The first measure, aggregated by the first aggregation function, is optimized; the others are simply evaluated. Default is the default measure for the task, see getDefaultMeasure.

par.set
(ParamHelpers::ParamSet) Collection of parameters and their constraints for optimization. Dependent parameters with a requires field must use quote, not expression, to define the requirement.

control
(TuneControl) Control object for the search method. Also selects the optimization algorithm for tuning.

show.info
(logical(1)) Print verbose output on console? Default is set via configureMlr.

resample.fun
(closure) The function to use for resampling. Defaults to resample. A user-supplied function must take the same arguments as resample and return the same type of object.
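As a sketch of the resample.fun hook: any replacement must accept the same arguments as resample and return the same type of result. The logging wrapper below is an illustrative assumption, not part of mlr:

# hypothetical wrapper: print a message before each evaluation, then delegate
logging_resample = function(learner, task, resampling, measures, ...) {
  message("Evaluating one hyperparameter configuration ...")
  resample(learner, task, resampling, measures = measures, ...)
}
# used as: tuneParams(..., resample.fun = logging_resample)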
Value

(TuneResult).
If you would like to include results from the training data set, make sure to appropriately adjust the resampling strategy and the aggregation for the measure. See example code below.
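A short sketch of consuming the returned object: the best point found is stored in $x and its aggregated performance in $y, and setHyperPars applies the result to a learner (res and the learner name below follow the examples on this page):

# res is a TuneResult as returned by tuneParams()
res$x  # named list with the best hyperparameter settings found
res$y  # aggregated performance of that point
lrn = setHyperPars(makeLearner("classif.ksvm"), par.vals = res$x)
mod = train(lrn, iris.task)  # refit on the full task with the tuned settings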
See Also

generateHyperParsEffectData
Other tune: TuneControl, getNestedTuneResultsOptPathDf(), getNestedTuneResultsX(), getResamplingIndices(), getTuneResult(), makeModelMultiplexerParamSet(), makeModelMultiplexer(), makeTuneControlCMAES(), makeTuneControlDesign(), makeTuneControlGenSA(), makeTuneControlGrid(), makeTuneControlIrace(), makeTuneControlMBO(), makeTuneControlRandom(), makeTuneWrapper(), tuneThreshold()
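Several of these functions combine into nested resampling: makeTuneWrapper fuses a learner with its tuning step, and the getNestedTuneResults* helpers extract the inner results. A minimal sketch, assuming mlr is loaded (parameter grid chosen for illustration):

lrn = makeTuneWrapper("classif.ksvm",
  resampling = makeResampleDesc("Holdout"),
  par.set = makeParamSet(makeDiscreteParam("C", values = 2^(-2:2))),
  control = makeTuneControlGrid())
outer = makeResampleDesc("CV", iters = 2L)
r = resample(lrn, iris.task, outer, extract = getTuneResult)
print(getNestedTuneResultsX(r))  # best settings per outer fold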
Examples

library(mlr)  # the examples require mlr; "classif.ksvm" additionally needs kernlab
set.seed(123)
# a grid search for an SVM (with a tiny number of points...)
# note how easily we can optimize on a log-scale
ps = makeParamSet(
makeNumericParam("C", lower = -12, upper = 12, trafo = function(x) 2^x),
makeNumericParam("sigma", lower = -12, upper = 12, trafo = function(x) 2^x)
)
ctrl = makeTuneControlGrid(resolution = 2L)
rdesc = makeResampleDesc("CV", iters = 2L)
res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl)
print(res)
# access data for all evaluated points
df = as.data.frame(res$opt.path)
df1 = as.data.frame(res$opt.path, trafo = TRUE)
print(head(df[, -ncol(df)]))
print(head(df1[, -ncol(df1)]))
# access data for all evaluated points - alternative
df2 = generateHyperParsEffectData(res)
df3 = generateHyperParsEffectData(res, trafo = TRUE)
print(head(df2$data[, -ncol(df2$data)]))
print(head(df3$data[, -ncol(df3$data)]))
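# as an illustrative sketch (not part of the original example), the generated
# data can also be visualized with plotHyperParsEffect, e.g. as a heatmap
# over both parameters:
plt = plotHyperParsEffect(df2, x = "C", y = "sigma", z = "mmce.test.mean",
  plot.type = "heatmap")
print(plt)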
## Not run:
# we optimize the SVM over 3 kernels simultaneously
# note how we use dependent params (requires = ...) and iterated F-racing here
ps = makeParamSet(
makeNumericParam("C", lower = -12, upper = 12, trafo = function(x) 2^x),
makeDiscreteParam("kernel", values = c("vanilladot", "polydot", "rbfdot")),
makeNumericParam("sigma", lower = -12, upper = 12, trafo = function(x) 2^x,
requires = quote(kernel == "rbfdot")),
makeIntegerParam("degree", lower = 2L, upper = 5L,
requires = quote(kernel == "polydot"))
)
print(ps)
ctrl = makeTuneControlIrace(maxExperiments = 5, nbIterations = 1, minNbSurvival = 1)
rdesc = makeResampleDesc("Holdout")
res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl)
print(res)
df = as.data.frame(res$opt.path)
print(head(df[, -ncol(df)]))
# include the training set performance as well
rdesc = makeResampleDesc("Holdout", predict = "both")
res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps,
control = ctrl, measures = list(mmce, setAggregation(mmce, train.mean)))
print(res)
df2 = as.data.frame(res$opt.path)
print(head(df2[, -ncol(df2)]))
## End(Not run)
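## Not run:
# sketch: tuning evaluations can be parallelized via the parallelMap package
# (assumes parallelMap is installed; the level "mlr.tuneParams" targets the
# tuning loop specifically)
library(parallelMap)
parallelStartSocket(2, level = "mlr.tuneParams")
res = tuneParams("classif.ksvm", iris.task, rdesc, par.set = ps, control = ctrl)
parallelStop()
## End(Not run)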