* Added `po("nn_identity")`.
* `LearnerTorchModule` for easily creating torch learners from torch modules.
* `TorchIngressToken` can now also take a `Selector` as the `features` argument.
* `po("nn_fn")` for calling custom functions in a network.
* `po("nn_ft_cls")` for concatenating a CLS token to a tokenized input.
* `nn("head")` was changed so that for binary classification tasks, `t_loss("cross_entropy")` now generates `nn_bce_with_logits_loss` instead of `nn_cross_entropy_loss`.
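As a minimal sketch of what this change means in practice (assuming the `classif.mlp` learner key and that the loss can be passed at construction, as is common for mlr3torch learners):

```r
library(mlr3torch)

# On a binary classification task, t_loss("cross_entropy") now
# constructs nn_bce_with_logits_loss under the hood instead of
# nn_cross_entropy_loss.
learner = lrn("classif.mlp",
  loss = t_loss("cross_entropy"),
  epochs = 1, batch_size = 32
)
```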
This also came with a reparametrization of the `t_loss("cross_entropy")` loss (thanks to @tdhock, #374).
* `LearnerTorchModel` can now be parallelized and trained with encapsulation activated.
* `jit_trace` now works in combination with batch normalization.
* Compatibility with R6 version 2.6.0.
* The private `LearnerTorch$.dataloader()` method no longer operates on the `task` but on the `dataset` generated by the private `LearnerTorch$.dataset()` method.
* The `shuffle` parameter during model training is now initialized to `TRUE` to sidestep
issues where data is sorted.
* The `jit_trace` parameter was added to `LearnerTorch`; when set to `TRUE`, it can lead to significant speedups.
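For illustration, a sketch assuming the `classif.mlp` learner key (`jit_trace` is set like any other training parameter):

```r
library(mlr3torch)

# jit_trace = TRUE traces the network with torch's JIT, which can
# speed up training for models whose forward pass is static.
learner = lrn("classif.mlp",
  epochs = 10, batch_size = 32,
  jit_trace = TRUE
)
```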
This should only be enabled for 'static' models; see the torch tutorial for more information.
* Added `num_interop_threads` to `LearnerTorch`.
* The `tensor_dataset` parameter was added, which allows stacking all batches at the beginning of training to make loading batches faster afterwards.
* Added a `PipeOp` for adaptive average pooling.
* The `n_layers` parameter was added to the MLP learner.
* Torch learners can now be used with `AutoTuner`.
* Internal tuning now uses `epochs - patience` for the internally tuned values instead of the trained number of epochs as it was before.
* The `dataset`
of a learner no longer needs to return the tensors on the specified device, which allows for parallel dataloading on GPUs.
* `PipeOpBlock` should no longer create ID clashes with other `PipeOp`s in the graph (#260).
* `data_formats` is not supported anymore.
* Added `CallbackSetTB`, which allows logging that can be viewed by TensorBoard.
* Fixed `PipeOp`s such as `po("trafo_resize")`, which failed in some cases.
* `LearnerTabResnet` now works correctly.
* Added the `nn()` helper function to simplify the creation of neural network layers.
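A sketch of the `nn()` helper, assuming `nn("linear")` abbreviates `po("nn_linear")` and that the resulting `PipeOp`s compose with `%>>%` from mlr3pipelines:

```r
library(mlr3torch)
library(mlr3pipelines)

# nn("linear") is shorthand for po("nn_linear"), so network
# layers can be chained concisely:
network = nn("linear", out_features = 32) %>>%
  nn("relu") %>>%
  nn("head")
```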