distillML: Model Distillation and Interpretability Methods for Machine Learning Models

Provides several methods for model distillation and interpretability for general black-box machine learning models, as well as treatment effect estimation methods. For details on the algorithms implemented, see <https://forestry-labs.github.io/distillML/index.html>. Authors: Brian Cho, Theo F. Saarinen, Jasjeet S. Sekhon, Simon Walter.

Getting started

Package details

Author: Brian Cho [aut], Theo Saarinen [aut, cre], Jasjeet Sekhon [aut], Simon Walter [aut]
Maintainer: Theo Saarinen <theo_s@berkeley.edu>
License: GPL (>= 3)
Version: 0.1.0.13
URL: https://github.com/forestry-labs/distillML
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:
install.packages("distillML")
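Once installed, a typical workflow wraps a fitted model in a `Predictor` object and builds an `Interpreter` on top of it. The sketch below follows the pattern described in the package documentation linked above; the exact class names (`Predictor`, `Interpreter`) and their arguments are assumptions based on that documentation, so consult it before running.

```r
# A minimal sketch, assuming the Predictor/Interpreter API from the
# package documentation at forestry-labs.github.io/distillML.
library(distillML)

# Fit any black-box model, e.g. a random forest on iris.
library(ranger)
forest <- ranger(Sepal.Length ~ ., data = iris)

# Wrap the model so distillML can query its predictions
# (argument names here are assumed, not verified).
predictor <- Predictor$new(model = forest,
                           data  = iris,
                           y     = "Sepal.Length",
                           task  = "regression")

# Build an interpreter for interpretability plots (e.g. PDPs)
# and downstream distillation into a simpler surrogate model.
interpreter <- Interpreter$new(predictor = predictor)
```

The design separates the model wrapper from the interpretation object, so the same `Interpreter` methods can be applied to any model that the `Predictor` can call for predictions.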


distillML documentation built on March 31, 2023, 8 p.m.