Hyperband algorithm for Hyperparameter Optimization
Usage

Hyperband(FUN, maximize, bounds, R, R_unit, eta, verbose, parallel)
Arguments

FUN
    The function to be optimized. This function should return a numeric value
    of validation performance, and its first argument must be the resource
    (the amount of computation allocated to the configuration).

maximize
    If TRUE, a larger evaluation score is better.

bounds
    A named list of lower and upper bounds for each hyperparameter. The names
    must be identical to the remaining arguments of FUN (all arguments except
    the resource).

R
    Resource parameter: the maximum amount of resource that can be allocated
    to a single hyperparameter configuration.

R_unit
    Resource unit: the minimum amount of computation at which different
    hyperparameter configurations start to separate. Set it to an integer to
    force an integer amount of resource to be allocated.

eta
    Controls the proportion of configurations discarded in each round of
    SuccessiveHalving (roughly 1/eta of the configurations are kept). See the
    schedule sketch after this argument list.

verbose
    Boolean; if TRUE, print statistics during the optimization process.

parallel
    Boolean; if TRUE, the inner loop is parallelized using the foreach package.
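As a rough guide to how R and eta interact, the bracket schedule from the Hyperband paper (Li et al., 2016) can be computed directly. The sketch below only illustrates the published schedule, using R = 81 and eta = 3 as in the examples; it is not this package's internal code, and the actual computation given to each configuration is presumably the printed value multiplied by R_unit.

R     <- 81      # maximum resource per configuration, matching the examples
eta   <- 3       # elimination rate
s_max <- floor(log(R, base = eta) + 1e-9)   # index of the most aggressive bracket
B     <- (s_max + 1) * R                    # budget assigned to each bracket
for (s in s_max:0) {
  n <- ceiling((B / R) * eta^s / (s + 1))   # configurations sampled in bracket s
  r <- R * eta^(-s)                         # initial resource given to each of them
  cat(sprintf("bracket s = %d: %d configurations, %g resource unit(s) each\n",
              as.integer(s), as.integer(n), r))
}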
Examples

# Example 1: Optimization
Test_Fun <- function(r, x) {
  exp(-(x - 2)^2) + exp(-(x - 6)^2 / 10) + 1 / (x^2 + 1) + r * 10e-10
}
OPT_Res <- Hyperband(Test_Fun, maximize = TRUE, bounds = list(x = c(-50, 50)),
                     R = 81L, R_unit = 10L, eta = 3, verbose = TRUE)
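# For reference (an illustrative addition, not part of the original example),
# the objective can be plotted over the search interval; its global maximum
# sits near x = 2, which is where OPT_Res should concentrate.
curve(exp(-(x - 2)^2) + exp(-(x - 6)^2 / 10) + 1 / (x^2 + 1), from = -50, to = 50)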
## Not run:
# Example 2: Parameter Tuning
library(xgboost)
data(agaricus.train, package = 'xgboost')
dtrain <- xgb.DMatrix(agaricus.train$data,
                      label = agaricus.train$label)
XGB_CV_FUN <- function(nrounds, lambda, lambda_bias, alpha) {
  XGB_CV <- xgb.cv(params = list(booster = "gblinear", eta = 0.1,
                                 lambda = lambda, lambda_bias = lambda_bias,
                                 alpha = alpha,
                                 objective = "binary:logistic",
                                 eval_metric = "logloss"),
                   data = dtrain, nrounds = nrounds, nfold = 5, verbose = 1,
                   callbacks = list(cb.early.stop(stopping_rounds = 10,
                                                  maximize = FALSE,
                                                  metric_name = "test-logloss"),
                                    cb.cv.predict(save_models = FALSE)))
  min(XGB_CV$evaluation_log$test_logloss_mean)
}
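# Optional sanity check (illustrative addition, not part of the original
# example): one cheap call confirms the objective returns a single numeric
# logloss value, with nrounds playing the role of the resource, before
# launching the full Hyperband run.
XGB_CV_FUN(nrounds = 10, lambda = 1, lambda_bias = 0, alpha = 1)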
OPT_Res <- Hyperband(XGB_CV_FUN, maximize = FALSE,
                     bounds = list(lambda = c(0, 5), lambda_bias = c(0L, 10L),
                                   alpha = c(0, 5)),
                     R = 1000L, R_unit = 1L, eta = 3, verbose = TRUE)
## End(Not run)
## Not run:
# Example 3: parallel version
library(doParallel)
cl <- makeCluster(2)
registerDoParallel(cl)
Test_Fun <- function(r, x) {
  exp(-(x - 2)^2) + exp(-(x - 6)^2 / 10) + 1 / (x^2 + 1) + r * 10e-10
}
foo <- Hyperband(Test_Fun, maximize = TRUE, bounds = list(x = c(-50, 50)),
                 R = 81L, R_unit = 10L, eta = 3, verbose = TRUE,
                 parallel = TRUE)
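# Illustration only (not from the package documentation): parallel = TRUE
# relies on foreach's %dopar% operator, so it uses whatever backend
# registerDoParallel(cl) registered above. That backend can be exercised
# directly like this:
library(foreach)
foreach(x = c(-10, 0, 2, 6), .combine = c) %dopar% Test_Fun(r = 1, x = x)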
stopCluster(cl)
## End(Not run)
References

Lisha Li, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, Ameet Talwalkar (2016). Hyperband: A Novel Bandit-Based Approach to Hyperparameter Optimization.