show_best() displays the top sub-models and their performance estimates.
show_best(x, metric = NULL, n = 5, ...)

select_best(x, metric = NULL, ...)

select_by_pct_loss(x, ..., metric = NULL, limit = 2)

select_by_one_std_err(x, ..., metric = NULL)
x: The results of tune_grid() or tune_bayes().

metric: A character value for the metric that will be used to sort the models. (See the yardstick documentation on metric types for more details.) Not required if a single metric exists in x.

n: An integer for the number of top results/rows to return.

limit: The limit of loss of performance that is acceptable (in percent units). See details below.
select_best() finds the tuning parameter combination with the best performance values.
select_by_one_std_err() uses the "one-standard-error rule" (Breiman et al., 1984), which selects the simplest model that is within one standard error of the numerically optimal results.
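The selection logic above can be sketched in base R. This is a minimal illustration, not the tune implementation; the toy data frame, its column names (K, mean, std_err), and the RMSE values are all hypothetical, with larger K treated as the simpler model as in the examples below:

```r
# Hypothetical resampling summaries: K = number of nearest neighbors
# (larger K = simpler decision boundary), mean = RMSE, std_err = its
# standard error.
results <- data.frame(
  K       = c(25, 15, 5),
  mean    = c(0.80, 0.78, 0.76),
  std_err = c(0.02, 0.02, 0.02)
)

# Upper bound: best (smallest) mean RMSE plus one standard error.
best  <- which.min(results$mean)
bound <- results$mean[best] + results$std_err[best]

# Keep candidates within one standard error of the best result...
within <- results[results$mean <= bound, ]

# ...and pick the simplest one (largest K, mirroring desc(K)).
simplest <- within[which.max(within$K), ]
simplest$K
```

Here the numerically best model (K = 5, RMSE 0.76) sets a bound of 0.78, so K = 15 qualifies while K = 25 does not, and K = 15 is selected.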
select_by_pct_loss() selects the simplest model whose loss of performance is within some acceptable limit.
For percent loss, suppose the best model has an RMSE of 0.75 and a simpler
model has an RMSE of 1. The percent loss would be
(1.00 - 0.75)/1.00 * 100,
or 25 percent. Note that loss will always be non-negative.
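The arithmetic above can be written out directly. This is a sketch of the percent-loss calculation only, using the document's example values and its stated formula; it is not the internals of select_by_pct_loss():

```r
# RMSE values from the worked example in the text.
best_rmse    <- 0.75   # numerically optimal model
simpler_rmse <- 1.00   # a simpler candidate model

# Percent loss as given in the text: (1.00 - 0.75) / 1.00 * 100.
pct_loss <- (simpler_rmse - best_rmse) / simpler_rmse * 100
pct_loss
```

With limit = 25 or greater, this simpler model would be an acceptable candidate; with the default limit = 2 it would not.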
A tibble with columns for the parameters. show_best() also includes columns for performance metrics.
Breiman, Leo; Friedman, J. H.; Olshen, R. A.; Stone, C. J. (1984). Classification and Regression Trees. Monterey, CA: Wadsworth.
data("example_ames_knn")

show_best(ames_iter_search, metric = "rmse")

select_best(ames_iter_search, metric = "rsq")

# To find the least complex model within one std error of the numerically
# optimal model, the number of nearest neighbors are sorted from the largest
# number of neighbors (the least complex class boundary) to the smallest
# (corresponding to the most complex model).
select_by_one_std_err(ames_grid_search, metric = "rmse", desc(K))

# Now find the least complex model that has no more than a 5% loss of RMSE:
select_by_pct_loss(
  ames_grid_search,
  metric = "rmse",
  limit = 5,
  desc(K)
)