mlr3fairness: Fairness Auditing and Debiasing for 'mlr3'

Integrates fairness auditing and bias mitigation methods into the 'mlr3' ecosystem. This includes fairness metrics, reporting tools, visualizations, and bias mitigation techniques such as "Reweighing" described in 'Kamiran and Calders' (2012) <doi:10.1007/s10115-011-0463-8> and "Equalized Odds" described in 'Hardt et al.' (2016) <https://papers.nips.cc/paper/2016/file/9d2682367c3935defcb1f9e247a97c0d-Paper.pdf>. Integration with 'mlr3' allows auditing of trained ML models as well as convenient joint tuning of machine learning algorithms and debiasing methods.
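
A minimal auditing sketch, assuming the 'adult_train' example task and the 'fairness.tpr' measure key shipped with the package (the decision-tree learner is an arbitrary choice, not a package recommendation):

library(mlr3)
library(mlr3fairness)

# Example task shipped with mlr3fairness; the protected attribute ("sex")
# is annotated via the task's "pta" column role.
task = tsk("adult_train")

# Audit a plain decision tree via cross-validation.
learner = lrn("classif.rpart", predict_type = "prob")
rr = resample(task, learner, rsmp("cv", folds = 3))

# "fairness.tpr": difference in true positive rates between protected groups.
rr$aggregate(msrs(c("classif.acc", "fairness.tpr")))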

Package details

Author: Florian Pfisterer [cre, aut] (<https://orcid.org/0000-0001-8867-762X>), Wei Siyi [aut], Michel Lang [aut] (<https://orcid.org/0000-0001-9754-0393>)
Maintainer: Florian Pfisterer <pfistererf@googlemail.com>
License: LGPL-3
Version: 0.3.2
URL: https://mlr3fairness.mlr-org.com, https://github.com/mlr-org/mlr3fairness
Package repository: CRAN
Installation

Install the latest version of this package by entering the following in R:

install.packages("mlr3fairness")

