distillML: Model Distillation and Interpretability Methods for Machine Learning Models

Provides several methods for model distillation and interpretability for general black-box machine learning models and for treatment effect estimation. For details on the algorithms implemented, see <https://forestry-labs.github.io/distillML/index.html> (Brian Cho, Theo F. Saarinen, Jasjeet S. Sekhon, and Simon Walter).
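
The package exposes its functionality through R6 classes. As a minimal sketch (assuming the Predictor/Interpreter constructors and the distill() generic shown in the package README; argument names are illustrative rather than a complete API reference), distilling an Rforestry forest might look like:

    library(distillML)
    library(Rforestry)

    # Fit a random forest (Rforestry is among the package's Imports)
    # to predict Sepal.Length from the remaining iris variables.
    set.seed(491)
    forest <- forestry(x = iris[, -1], y = iris[, 1])

    # Wrap the fitted model in a Predictor object, then build an
    # Interpreter for it; these are the R6 entry points described
    # in the package documentation.
    predictor <- Predictor$new(model = forest,
                               data  = iris,
                               y     = "Sepal.Length",
                               task  = "regression")
    interpreter <- Interpreter$new(predictor = predictor)

    # Distill the black-box model into an interpretable surrogate
    # and inspect its predictions (predict() signature assumed from
    # the README example).
    surrogate <- distill(interpreter)
    head(predict(surrogate, iris))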

Version: 0.1.0.13
Imports: ggplot2, glmnet, Rforestry, dplyr, R6 (≥ 2.0), checkmate, purrr, tidyr, data.table, mltools, gridExtra
Suggests: testthat, knitr, rmarkdown, mvtnorm
Published: 2023-03-25
DOI: 10.32614/CRAN.package.distillML
Author: Brian Cho [aut], Theo Saarinen [aut, cre], Jasjeet Sekhon [aut], Simon Walter [aut]
Maintainer: Theo Saarinen <theo_s at berkeley.edu>
BugReports: https://github.com/forestry-labs/distillML/issues
License: GPL (≥ 3)
URL: https://github.com/forestry-labs/distillML
NeedsCompilation: no
Materials: README
CRAN checks: distillML results

Documentation:

Reference manual: distillML.pdf

Downloads:

Package source: distillML_0.1.0.13.tar.gz
Windows binaries: r-devel: distillML_0.1.0.13.zip, r-release: distillML_0.1.0.13.zip, r-oldrel: distillML_0.1.0.13.zip
macOS binaries: r-release (arm64): distillML_0.1.0.13.tgz, r-oldrel (arm64): distillML_0.1.0.13.tgz, r-release (x86_64): distillML_0.1.0.13.tgz, r-oldrel (x86_64): distillML_0.1.0.13.tgz
Old sources: distillML archive
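
The released version can be installed from CRAN in the usual way:

    install.packages("distillML")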

Linking:

Please use the canonical form https://CRAN.R-project.org/package=distillML to link to this page.