* `autoplot.tabnet_fit()` (#67)
* `tabnet_pretrain()` now allows missing values in predictors. (#68)
* `tabnet_explain()` now works for `tabnet_pretrain` models. (#68)
* `random_obfuscator()` torch_nn module. (#68)
* `tabnet_fit()` and `predict()` now allow missing values in predictors. (#76)
* `tabnet_config()` now supports a `num_workers=` parameter to control parallel dataloading. (#83)
* `tabnet_config()` now has a `skip_importance` flag to skip calculating feature importance. (@egillax, #91)
* `tabnet_nn`
* `min_grid.tabnet` method for tune. (@cphaarmeyer, #107)
* `tabnet_explain()` method for parsnip models. (@cphaarmeyer, #108)
* `tabnet_fit()` and `predict()` now allow multiple outcomes, either all numeric or all factors, but not mixed. (#118)
* `tabnet_explain()` now correctly handles missing values in predictors. (#77)
* `dataloader` can now use `num_workers > 0`. (#83)
* `batch_size` and `virtual_batch_size` improve performance on mid-range devices.
* `engine = "torch"` for the tabnet parsnip model. (#114)
* `autoplot()` warnings turned into errors with {ggplot2} v3.4. (#113)
* `update` method for tabnet models to allow the correct usage of `finalize_workflow`. (#60)
* `tabnet_fit()` (@cregouby, #26)
* `tabnet_explain()`
* `tabnet_pretrain()` for unsupervised pretraining. (@cregouby, #29)
* `autoplot()` of model loss among epochs. (@cregouby, #36)
* `config` argument to `fit()` / `pretrain()` so one can pass a pre-made config list. (#42)
* `tabnet_config()`: new `mask_type` option with `entmax` in addition to the default `sparsemax`. (@cmcmaster1, #48)
* `tabnet_config()`: `loss` now also takes a function. (@cregouby, #55)
* Added a `NEWS.md` file to track changes to the package.
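Several entries above concern `tabnet_config()` options (`num_workers`, `skip_importance`, `mask_type`, `loss`) and the `config` argument to `fit()` / `pretrain()`. A minimal sketch of how they combine (illustrative only: `my_data` and the `Sales` outcome are invented, and defaults may differ across package versions):

```r
library(tabnet)

# Build a config list once, then pass it to tabnet_fit() via `config` (#42).
cfg <- tabnet_config(
  num_workers = 2,              # parallel dataloading (#83)
  skip_importance = TRUE,       # skip feature-importance computation (#91)
  mask_type = "entmax",         # alternative to the default "sparsemax" (#48)
  loss = torch::nn_mse_loss()   # `loss` can also be given as a function (#55)
)

fit <- tabnet_fit(Sales ~ ., data = my_data, config = cfg)
```

The same `cfg` list can be reused across `tabnet_pretrain()` and `tabnet_fit()` calls, which is the point of accepting a pre-made config.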