pfistfl/distillery: Compress arbitrary mlr3 models into keras models

Compresses arbitrary mlr3 models into neural networks (keras models). The methodology is similar to model compression as described by Bucilă, Caruana & Niculescu-Mizil (2006), "Model Compression".

Getting started
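To illustrate the idea, here is a minimal sketch of the compression workflow written directly against mlr3 and keras; it does not use the distillery API, and the task, teacher learner, and network architecture are arbitrary placeholders chosen for the example:

# Sketch of the model-compression idea (not the distillery API):
# fit a "teacher" mlr3 learner, then train a small keras "student" network
# on the teacher's predicted class probabilities (soft labels).
library(mlr3)
library(mlr3learners)  # provides the ranger learner used as the teacher
library(keras)

task = tsk("sonar")
teacher = lrn("classif.ranger", predict_type = "prob")
teacher$train(task)

# Features as a numeric matrix and the teacher's soft labels
x = as.matrix(task$data(cols = task$feature_names))
soft_labels = teacher$predict(task)$prob

# Student: a small feed-forward network trained to mimic the teacher
student = keras_model_sequential() %>%
  layer_dense(units = 16, activation = "relu", input_shape = ncol(x)) %>%
  layer_dense(units = ncol(soft_labels), activation = "softmax")

student %>% compile(optimizer = "adam", loss = "categorical_crossentropy")
student %>% fit(x, soft_labels, epochs = 50, verbose = 0)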

Package details

Maintainer: Florian Pfisterer <florian.pfisterer@stat.uni-muenchen.de>
License: GPL-3
Version: 0.0.0.9000
URL: https://mlr3misc.mlr-org.com https://github.com/mlr-org/mlr3misc
Package repository: View on GitHub
Installation: Install the latest version of this package by entering the following in R:
install.packages("remotes")
remotes::install_github("pfistfl/distillery")