distillery-package: distillery: Compress arbitrary mlr3 models into keras models

Description

Compresses arbitrary mlr3 models into compact keras neural networks. The methodology is similar to Buciluă, Caruana & Niculescu-Mizil (2006): Model Compression.
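The core idea, as a minimal sketch: train an arbitrary mlr3 "teacher" learner, take its predicted class probabilities as soft labels, and fit a small keras "student" network on them. The snippet below uses only mlr3 and keras to illustrate the technique; it is not the distillery API (which may differ), and it reuses the training data as the transfer set for brevity.

    library(mlr3)
    library(keras)

    # 1. Train an arbitrary "teacher" mlr3 learner with probability output.
    task    <- tsk("spam")
    teacher <- lrn("classif.rpart", predict_type = "prob")
    teacher$train(task)

    # 2. Use the teacher's class probabilities as soft labels.
    #    (Caruana-style compression would score a large unlabeled
    #    transfer set here; the training data is reused for brevity.)
    x    <- as.matrix(task$data(cols = task$feature_names))
    soft <- teacher$predict(task)$prob

    # 3. Fit a compact keras "student" network on the soft labels.
    student <- keras_model_sequential() %>%
      layer_dense(units = 32, activation = "relu", input_shape = ncol(x)) %>%
      layer_dense(units = ncol(soft), activation = "softmax")
    student %>% compile(optimizer = "adam", loss = "categorical_crossentropy")
    student %>% fit(x, soft, epochs = 20, verbose = 0)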

Author(s)

Maintainer: Jakob Bodensteiner <jakob.bodensteiner@campus.lmu.de>

See Also

Useful links:

https://github.com/pfistfl/distillery

