Description
This function can be used to speed up parameter tuning when it is reasonable to assume that the loss function is convex with respect to its parameters, taken one at a time.
Usage

multi_convex_accept(params_df, scores, candidate_param, convex_params = NULL,
  patience = 1)
Arguments

params_df: Combinations of parameters previously encountered.

scores: Corresponding score values previously encountered.

candidate_param: New parameter combination to accept or reject.

convex_params: Names of the parameters with respect to which the loss function should be considered convex. If NULL, all numeric parameters are used.

patience: Number of increases in the target value to tolerate before rejecting a parameter value.
Details

Most machine learning algorithms have parameters that control complexity, e.g. the number of layers and hidden nodes in a neural network, max_depth for boosted trees, etc. The loss function is usually convex with respect to these parameters: with all other parameters held fixed, it first decreases as the complexity parameter is increased, reaches a minimum, then starts increasing. This function avoids exploring regions of the parameter space beyond that minimum. A simplified sketch of this idea for a single parameter is given below, followed by the package's own example.
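The following is a minimal illustrative sketch of the convexity heuristic for one parameter, not the package's implementation: the function toy_reject_along_param and its exact rule are hypothetical. It rejects a candidate value when the scores already observed at smaller values of that parameter (other parameters ignored for simplicity) have been rising for more than patience consecutive steps, i.e. the minimum is assumed to be behind us.

library(dplyr)

toy_reject_along_param <- function(path_df, score_col, param,
                                   candidate_value, patience = 1) {
  # scores observed at values of `param` below the candidate, ordered by `param`
  below <- path_df %>%
    filter(.data[[param]] < candidate_value) %>%
    arrange(.data[[param]])
  if (nrow(below) <= patience + 1) return(FALSE)  # too little evidence to reject

  score_path <- below[[score_col]]
  # count the consecutive increases at the end of the path: if the loss has
  # been rising for more than `patience` steps, reject the larger candidate
  rising <- rev(diff(score_path) > 0)
  trailing_increases <- if (all(rising)) length(rising) else which(!rising)[1] - 1
  trailing_increases > patience
}

# e.g. toy_reject_along_param(optim_path_df, "y", "x2", candidate_value = 5),
# using the optim_path_df defined in the Examples below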
Examples

library(dplyr)

optim_path_df <-
  tribble(
    ~x1, ~x2, ~y,
    4, 1, 10,
    4, 2, 9,
    4, 3, 8,
    4, 4, 8.5,
    5, 1, 11,
    5, 2, 10,
    5, 3, 7
  )

target_var <- 'y'
candidate_param <- c(x1 = 6, x2 = 2)
multi_convex_accept(optim_path_df, target_var, candidate_param)
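The documentation above does not state the return value; assuming the function returns TRUE/FALSE (accept/reject), a tuning loop might use it as a filter so that rejected candidates are never fitted or scored. This calling pattern is a hypothetical sketch reusing optim_path_df and target_var from the example above.

# assumes multi_convex_accept() returns TRUE for accepted candidates
for (x1 in 4:8) {
  for (x2 in 1:4) {
    candidate <- c(x1 = x1, x2 = x2)
    if (!multi_convex_accept(optim_path_df, target_var, candidate)) next
    # ... fit the model with `candidate`, compute its score, and append a new
    # (x1, x2, y) row to optim_path_df before trying the next candidate
  }
}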