xgb.ggplot.shap.summary    R Documentation

Description

Visualizes SHAP contributions of different features.
Usage

xgb.ggplot.shap.summary(
data,
shap_contrib = NULL,
features = NULL,
top_n = 10,
model = NULL,
trees = NULL,
target_class = NULL,
approxcontrib = FALSE,
subsample = NULL
)
xgb.plot.shap.summary(
data,
shap_contrib = NULL,
features = NULL,
top_n = 10,
model = NULL,
trees = NULL,
target_class = NULL,
approxcontrib = FALSE,
subsample = NULL
)
Arguments

data
The data to explain as a matrix or dgCMatrix.

shap_contrib
Matrix of SHAP contributions of data. The default (NULL) computes it from model and data.

features
Vector of column indices or feature names to plot. When NULL (default), the top_n most important features are selected by xgb.importance().

top_n
How many of the most important features (<= 100) should be selected?
By default 1 for SHAP dependence and 10 for SHAP summary plots.
Only used when features = NULL.

model
An xgb.Booster model. Only required when shap_contrib = NULL or features = NULL.

trees
Passed to xgb.importance() when features = NULL.

target_class
Only relevant for multiclass models. The default (NULL) averages the SHAP values over all classes. Pass a (0-based) class index to show only SHAP values of that class.

approxcontrib
Passed to predict() when shap_contrib = NULL.

subsample
Fraction of data points randomly picked for plotting.
The default (NULL) uses up to 100k data points.
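As a hedged illustration of how shap_contrib and approxcontrib fit together (the objects bst and X below are assumptions, not defined on this page), the contributions can be precomputed once with predict() and then passed in, so repeated plots do not recompute them:

# Sketch only: precompute SHAP contributions and reuse them for plotting.
# `bst` is assumed to be a trained xgb.Booster, `X` the feature matrix it was trained on.
contrib <- predict(bst, X, predcontrib = TRUE, approxcontrib = FALSE)
# The result has one column per feature plus a trailing "BIAS" column; the plotting
# function selects the requested feature columns by name, so the extra column
# should not interfere.
xgb.ggplot.shap.summary(X, shap_contrib = contrib, model = bst, top_n = 10)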
Details

A point plot (each point representing one observation from data) is produced for each feature, with the points plotted along the SHAP value axis.
Each point (observation) is coloured by its feature value.
The plot makes it easy to see which features contribute negatively or positively to the model prediction, and whether that contribution differs for larger or smaller values of the feature. It is inspired by the summary plot of https://github.com/shap/shap.
Value

A ggplot2 object.
See Also

xgb.plot.shap(), xgb.ggplot.shap.summary(),
and the Python library https://github.com/shap/shap.
Examples

# See examples in xgb.plot.shap()
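# A hedged, self-contained example (not from the original help page): train a
# small classifier on the bundled agaricus data and draw the SHAP summary plot.
# The training call follows the classic xgb.train() interface; parameter handling
# may differ slightly across xgboost versions.
library(xgboost)
data(agaricus.train, package = "xgboost")

dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
bst <- xgb.train(
  params = list(objective = "binary:logistic", max_depth = 3, eta = 0.3),
  data = dtrain,
  nrounds = 20
)

# Summary plot of the 10 most important features; the returned ggplot2 object
# can be customized further with the usual ggplot2 layers.
p <- xgb.ggplot.shap.summary(agaricus.train$data, model = bst, top_n = 10)
print(p)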