Shapley

Description

Shapley computes feature contributions for single predictions with the Shapley value, an approach from cooperative game theory. The feature values of an instance cooperate to achieve the prediction. The Shapley value fairly distributes the difference between the instance's prediction and the dataset's average prediction among the features.

For more details on the algorithm, see https://christophm.github.io/interpretable-ml-book/shapley.html
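A useful consequence of this definition is that the per-feature Shapley values (phi) add up to the difference between the instance's prediction and the average prediction. The following is a minimal sketch of that check, not part of the package documentation; the object names (tree, X, mod) are illustrative, and because the phi values here are Monte Carlo estimates, the sum matches only approximately:

library("iml")
library("rpart")
data("Boston", package = "MASS")
tree <- rpart(medv ~ ., data = Boston)
X <- Boston[-which(names(Boston) == "medv")]
mod <- Predictor$new(tree, data = X)
shapley <- Shapley$new(mod, x.interest = X[1, ])
# The phi values should approximately sum to
# prediction(x.interest) minus the average prediction:
sum(shapley$results$phi)
mod$predict(X[1, ])[[1]] - mean(mod$predict(X)[[1]])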
Super classes

iml::InterpretationMethod -> Shapley
Public fields

x.interest
data.frame
Single row with the instance to be explained.

y.hat.interest
numeric
Predicted value for the instance of interest.

y.hat.average
numeric(1)
Average predicted value for the data X.

sample.size
numeric(1)
The number of times coalitions/marginals are sampled from data X. The higher the value, the more accurate the explanations become (see the sketch after this list).
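To illustrate the accuracy trade-off controlled by sample.size, here is a hedged sketch comparing a small and a large sample size. It assumes the mod and X objects from the sketch above and that the results table exposes the per-feature estimation variance as a phi.var column (true for current iml versions, but verify against your installed version):

# Fewer samples: faster, but noisier phi estimates
shapley.fast <- Shapley$new(mod, x.interest = X[1, ], sample.size = 10)
# More samples: slower, but lower estimation variance
shapley.precise <- Shapley$new(mod, x.interest = X[1, ], sample.size = 1000)
# Compare the estimation variance per feature
cbind(
  fast    = shapley.fast$results$phi.var,
  precise = shapley.precise$results$phi.var
)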
Methods

Method new()

Create a Shapley object.

Usage:
Shapley$new(predictor, x.interest = NULL, sample.size = 100)

Arguments:

predictor
Predictor
The object (created with Predictor$new()) holding the machine learning model and the data.

x.interest
data.frame
Single row with the instance to be explained.

sample.size
numeric(1)
The number of Monte Carlo samples for estimating the Shapley value.

Returns:

data.frame with the Shapley values (phi) per feature.
Method explain()

Set a new data point to explain.

Usage:
Shapley$explain(x.interest)

Arguments:

x.interest
data.frame
Single row with the instance to be explained.
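Because x.interest defaults to NULL in the constructor, a single Shapley object can be reused to explain several instances in turn via explain(). A short sketch, assuming the mod and X objects from the earlier sketch:

# Construct once without an instance ...
shapley <- Shapley$new(mod, sample.size = 100)
# ... then explain instances one after another
shapley$explain(X[1, ])
shapley$results
shapley$explain(X[2, ])
shapley$results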
Method clone()

The objects of this class are cloneable with this method.

Usage:
Shapley$clone(deep = FALSE)

Arguments:

deep
Whether to make a deep clone.
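Since R6 objects are mutable, explain() overwrites the stored results in place; clone() can preserve an earlier explanation. A small sketch, assuming the shapley object from above:

# Keep the current explanation while moving on to another instance
shapley.kept <- shapley$clone()
shapley$explain(X[3, ])
# shapley.kept$results still holds the values for the instance
# that was explained before the clone was made
shapley.kept$results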
References

Strumbelj, E., Kononenko, I. (2014). Explaining prediction models and individual predictions with feature contributions. Knowledge and Information Systems, 41(3), 647-665. https://doi.org/10.1007/s10115-013-0679-x
See Also

A different way to explain predictions: LocalModel
library("rpart")
# First we fit a machine learning model on the Boston housing data
data("Boston", package = "MASS")
rf <- rpart(medv ~ ., data = Boston)
X <- Boston[-which(names(Boston) == "medv")]
mod <- Predictor$new(rf, data = X)
# Then we explain the first instance of the dataset with the Shapley method:
x.interest <- X[1, ]
shapley <- Shapley$new(mod, x.interest = x.interest)
shapley
# Look at the results in a table
shapley$results
# Or as a plot
plot(shapley)
# Explain another instance
shapley$explain(X[2, ])
plot(shapley)
## Not run:
# Shapley() also works with multiclass classification
rf <- rpart(Species ~ ., data = iris)
X <- iris[-which(names(iris) == "Species")]
mod <- Predictor$new(rf, data = X, type = "prob")
# Then we explain the first instance of the dataset with the Shapley() method:
shapley <- Shapley$new(mod, x.interest = X[1, ])
shapley$results
plot(shapley)
# You can also focus on one class
mod <- Predictor$new(rf, data = X, type = "prob", class = "setosa")
shapley <- Shapley$new(mod, x.interest = X[1, ])
shapley$results
plot(shapley)
## End(Not run)