| GANLearner_from_learners | R Documentation | 
Create a GAN Learner from a generator learner ('gen_learn') and a critic learner ('crit_learn').
GANLearner_from_learners(
  gen_learn,
  crit_learn,
  switcher = NULL,
  weights_gen = NULL,
  gen_first = FALSE,
  switch_eval = TRUE,
  show_img = TRUE,
  clip = NULL,
  cbs = NULL,
  metrics = NULL,
  loss_func = NULL,
  opt_func = Adam(),
  lr = 0.001,
  splitter = trainable_params(),
  path = NULL,
  model_dir = "models",
  wd = NULL,
  wd_bn_bias = FALSE,
  train_bn = TRUE,
  moms = list(0.95, 0.85, 0.95)
)
| gen_learn | The generator learner. | 
| crit_learn | The critic (discriminator) learner. | 
| switcher | A callback that decides when to switch between training the critic and the generator. | 
| weights_gen | Optional weights used when combining the critic's output with the generator's own loss. | 
| gen_first | If TRUE, training starts with the generator; otherwise it starts with the critic. | 
| switch_eval | If TRUE, the model that is not currently being trained is put in eval mode. | 
| show_img | Whether to display a sample generated image at the end of each epoch. | 
| clip | Value to clip the critic's weights to (NULL for no clipping). | 
| cbs | One Callback or a list of Callbacks to pass to the Learner. | 
| metrics | An optional list of metrics, each either a function or a Metric. | 
| loss_func | The loss function. | 
| opt_func | The function used to create the optimizer. | 
| lr | The learning rate. | 
| splitter | A function that takes self.model and returns a list of parameter groups (or a single parameter group if the model is not split). | 
| path | The folder to work in. | 
| model_dir | Together with path, used to save and/or load models. | 
| wd | The default weight decay used when training the model. | 
| wd_bn_bias | Controls whether weight decay is applied to BatchNorm layers and bias parameters. | 
| train_bn | Controls whether BatchNorm layers are trained even when they are supposed to be frozen according to the splitter. | 
| moms | The default momentums used in Learner$fit_one_cycle. | 
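A minimal usage sketch, assuming the fastai R package is attached and that `gen_learn` and `crit_learn` are Learner objects built beforehand (e.g. via unet_learner() and cnn_learner()); the weights_gen values shown are illustrative, not defaults:

```r
library(fastai)

# Combine a pre-built generator and critic into a GAN Learner.
# weights_gen balances the critic score against the generator's own loss
# (illustrative values; tune for your task).
learn <- GANLearner_from_learners(
  gen_learn,
  crit_learn,
  weights_gen = list(1., 50.),
  show_img = FALSE
)

# Train for one epoch at a chosen learning rate.
learn %>% fit(1, lr = 2e-4)
```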
Value: None