sits_tuning: Tuning machine learning model hyperparameters

View source: R/sits_tuning.R

sits_tuning  R Documentation

Tuning machine learning model hyperparameters

Description

Machine learning models use stochastic gradient descent (SGD) techniques to find optimal solutions. To perform SGD, models use optimization algorithms that have hyperparameters which must be adjusted to achieve the best performance for each application.

This function performs a random search over values of selected hyperparameters. Instead of testing all parameter combinations exhaustively, it selects them randomly. Validation is done using an independent set of samples or a validation split. The function returns the results of all trials in a tibble, with the best hyperparameters listed first. Hyperparameters should be supplied to the params argument by calling sits_tuning_hparams().
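The random-search idea described above can be sketched in base R alone. This is an illustrative sketch, not the sits implementation: the objective function below is a toy stand-in for the validation accuracy that sits_tuning() actually computes by training a model on each trial.

```r
# Random search sketch: draw `trials` candidate learning rates from a
# log-uniform distribution, score each, and rank the results.
set.seed(42)
trials <- 30
# log-uniform draw between 10^-4 and 10^-2 (uniform on the log scale)
lr_candidates <- exp(runif(trials, log(1e-4), log(1e-2)))
# Hypothetical objective: in sits_tuning() this would be the accuracy of a
# model trained with the candidate value; here, a toy function whose
# maximum is at lr = 1e-3.
score <- function(lr) -(log10(lr) + 3)^2
results <- data.frame(
  lr    = lr_candidates,
  score = vapply(lr_candidates, score, numeric(1))
)
# order trials from best to worst, as sits_tuning() orders by accuracy
results <- results[order(-results$score), ]
best_lr <- results$lr[1]
```

More trials explore the search space more densely at a proportionally higher cost, which is why the `trials` argument trades accuracy of the search against runtime.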

Usage

sits_tuning(
  samples,
  samples_validation = NULL,
  validation_split = 0.2,
  ml_method = sits_tempcnn(),
  params = sits_tuning_hparams(optimizer = torch::optim_adamw, opt_hparams = list(lr =
    loguniform(10^-2, 10^-4))),
  trials = 30,
  multicores = 2,
  progress = FALSE
)

Arguments

samples

Time series set to be validated.

samples_validation

Time series set used for validation.

validation_split

Fraction of the original time series set to be used for validation (used when samples_validation is NULL).

ml_method

Machine learning method.

params

List of hyperparameters to be passed to ml_method. Users can use the uniform, choice, randint, normal, lognormal, loguniform, and beta distribution functions to randomize parameters.
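As a rough sketch of what a log-uniform draw means (base R only; sits provides loguniform() for this, and the helper below is not part of the package API): sampling uniformly on the log scale explores small learning rates as densely as large ones, which is why it is the usual choice for tuning learning rates.

```r
# Illustrative log-uniform sampler: uniform on the log scale, then exp().
rloguniform <- function(n, min, max) exp(runif(n, log(min), log(max)))

set.seed(1)
draws <- rloguniform(1000, 1e-4, 1e-2)
# every draw lies within [1e-4, 1e-2], with roughly equal mass per decade
range(draws)
```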

trials

Number of trials in the random search.

multicores

Number of cores to process in parallel.

progress

Show progress bar?

Value

A tibble containing the parameters used to train each trial, ordered by accuracy.

Author(s)

Rolf Simoes, rolf.simoes@inpe.br

References

James Bergstra, Yoshua Bengio, "Random Search for Hyper-Parameter Optimization". Journal of Machine Learning Research. 13: 281–305, 2012.

Examples

if (sits_run_examples()) {
    # find best learning rate parameters for TempCNN
    tuned <- sits_tuning(
        samples_modis_ndvi,
        ml_method = sits_tempcnn(),
        params = sits_tuning_hparams(
            optimizer = choice(
                torch::optim_adamw
            ),
            opt_hparams = list(
                lr = loguniform(10^-2, 10^-4)
            )
        ),
        trials = 4,
        multicores = 2,
        progress = FALSE
    )
    # obtain best accuracy, kappa and best_lr
    accuracy <- tuned$accuracy[[1]]
    kappa <- tuned$kappa[[1]]
    best_lr <- tuned$opt_hparams[[1]]$lr
}


sits documentation built on Sept. 11, 2024, 6:36 p.m.