setup: Sets up a nn_module to use with luz

View source: R/module.R

setup R Documentation

Sets up a nn_module to use with luz

Description

The setup function is used to set important attributes and methods for nn_modules to be used with luz.

Usage

setup(module, loss = NULL, optimizer = NULL, metrics = NULL, backward = NULL)

Arguments

module

(nn_module) The nn_module that you want to set up.

loss

(function, optional) An optional function with the signature function(input, target). It's only required if your nn_module doesn't implement a method called loss.

optimizer

(torch_optimizer, optional) A function with the signature function(parameters, ...) that is used to initialize an optimizer given the model parameters.

metrics

(list, optional) A list of metrics to be tracked during the training procedure. Sometimes you want certain metrics to be evaluated only during training or validation; in that case, you can pass a luz_metric_set() object to specify the metrics used in each stage.

backward

(function) A function that takes the loss scalar value as its parameter. It must call $backward() or torch::autograd_backward(). In general you don't need to set this parameter unless you need to customize how luz calls backward(), for example, if you need to add additional arguments to the backward call. Note that this becomes a method of the nn_module and can thus be used by your custom step() if you override it.
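
For illustration, the sketch below passes a custom backward function to setup(). The module, layer sizes, and the retain_graph argument are hypothetical; in most cases the default backward behaviour is sufficient.

library(torch)
library(luz)

# Hypothetical module: the custom `backward` receives the loss scalar and is
# responsible for calling $backward() itself, here forwarding an extra
# argument to torch.
net <- nn_module(
  initialize = function() {
    self$fc <- nn_linear(10, 1)
  },
  forward = function(x) {
    self$fc(x)
  }
)

modified_net <- net |>
  setup(
    loss = nnf_mse_loss,
    optimizer = optim_adam,
    backward = function(loss) {
      loss$backward(retain_graph = TRUE) # e.g. keep the graph for a later pass
    }
  )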

Details

It makes sure the module has all the necessary ingredients in order to be fitted.

Value

A luz module that can be trained with fit().
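
A minimal end-to-end sketch, assuming an in-memory regression problem (the data, module, hyper-parameters, and epoch count below are hypothetical):

library(torch)
library(luz)

# Hypothetical toy data: 100 observations, 10 predictors.
x <- torch_randn(100, 10)
y <- torch_randn(100, 1)

net <- nn_module(
  initialize = function(n_in, n_out) {
    self$fc <- nn_linear(n_in, n_out)
  },
  forward = function(x) {
    self$fc(x)
  }
)

fitted <- net |>
  setup(
    loss = nnf_mse_loss,
    optimizer = optim_adam,
    # a plain list of metrics; use luz_metric_set() to track different
    # metrics during training and validation
    metrics = list(luz_metric_mae())
  ) |>
  set_hparams(n_in = 10, n_out = 1) |>
  fit(list(x, y), epochs = 5, verbose = FALSE)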

Note

It also adds a device active field that can be used to query the current module device within methods, e.g. with self$device. This is useful when ctx() is not available, e.g. when calling methods from outside the luz wrappers. Users can override the default by implementing a device active method in the input module.
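
For example, a module set up with setup() and run through luz can create auxiliary tensors on the correct device from within its own methods (a hypothetical sketch):

library(torch)
library(luz)

# Hypothetical module: self$device is the active field added by setup(),
# so the constant below ends up on the same device as the parameters.
net <- nn_module(
  initialize = function() {
    self$fc <- nn_linear(10, 1)
  },
  forward = function(x) {
    ones <- torch_ones(1, device = self$device)
    self$fc(x) + ones
  }
)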

See Also

Other training: evaluate(), fit.luz_module_generator(), predict.luz_module_fitted()
