View source: R/bart_package_plots.R
investigate_var_importance — R Documentation
Explore the variable inclusion proportions for a BART model to learn about the relative influence of the different covariates. This function includes an option to generate a plot of the variable inclusion proportions.
investigate_var_importance(bart_machine, type = "splits",
plot = TRUE, num_replicates_for_avg = 5, num_trees_bottleneck = 20,
num_var_plot = Inf, bottom_margin = 10)
bart_machine: An object of class "bartMachine".

type: If "splits", the proportion of times each variable is chosen for a splitting rule is computed. If "trees", the proportion of times each variable appears in a tree is computed.

plot: If TRUE, a plot of the variable inclusion proportions is generated.

num_replicates_for_avg: The number of replicates of BART used to generate the variable inclusion proportions. Averaging across multiple BART models improves the stability of the estimates. See Bleich et al. (2013) for more details.

num_trees_bottleneck: Number of trees to be used in the sum-of-trees model when computing the variable inclusion proportions. A small number of trees should be used to force the variables to compete for entry into the model. Chipman et al. (2010) recommend 20; see that reference for more details.

num_var_plot: Number of variables to be shown on the plot. If Inf, all variables are plotted.

bottom_margin: A display parameter that adjusts the bottom margin of the graph if labels are clipped. The scale of this parameter is the same as that set with par(mar) in R.
In the plot, the red bars correspond to the standard error of the variable inclusion proportion estimates.
Invisibly, returns a list with the following components:
avg_var_props: The average variable inclusion proportion for each variable.

sd_var_props: The standard deviation of the variable inclusion proportions for each variable (across the replicates).
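Because the list is returned invisibly, assign the call to an object to inspect these components. A minimal sketch, assuming a fitted model named bart_machine:

```r
# Compute inclusion proportions without drawing the plot, keeping the result
vi <- investigate_var_importance(bart_machine, type = "splits", plot = FALSE)

# Rank covariates by average inclusion proportion
sort(vi$avg_var_props, decreasing = TRUE)

# Approximate +/- 1 standard-error bands around each estimate
vi$avg_var_props + vi$sd_var_props
vi$avg_var_props - vi$sd_var_props
```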
This function is parallelized by the number of cores set in set_bart_machine_num_cores.
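For example, to run the replicate BART fits across four cores (set_bart_machine_num_cores is part of the bartMachine package; the core count here is illustrative):

```r
library(bartMachine)

# Use 4 cores so the num_replicates_for_avg BART fits run in parallel
set_bart_machine_num_cores(4)
```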
Adam Kapelner and Justin Bleich
Adam Kapelner, Justin Bleich (2016). bartMachine: Machine Learning with Bayesian Additive Regression Trees. Journal of Statistical Software, 70(4), 1-40. doi:10.18637/jss.v070.i04
J Bleich, A Kapelner, ST Jensen, and EI George. Variable Selection Inference for Bayesian Additive Regression Trees. ArXiv e-prints, 2013.
HA Chipman, EI George, and RE McCulloch. BART: Bayesian Additive Regression Trees. The Annals of Applied Statistics, 4(1): 266–298, 2010.
interaction_investigator
## Not run:
library(bartMachine)

# generate Friedman data
set.seed(11)
n = 200
p = 10
X = data.frame(matrix(runif(n * p), ncol = p))
y = 10 * sin(pi * X[, 1] * X[, 2]) + 20 * (X[, 3] - 0.5)^2 + 10 * X[, 4] + 5 * X[, 5] + rnorm(n)

## build BART regression model
bart_machine = bartMachine(X, y, num_trees = 20)

# investigate variable inclusion proportions
investigate_var_importance(bart_machine)

## End(Not run)
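Building on the example above, the type and num_var_plot arguments can be varied; a brief sketch (argument values chosen for illustration):

```r
# Proportion of trees in which each variable appears, rather than split rules
investigate_var_importance(bart_machine, type = "trees")

# Show only the ten most-included variables, averaging over 10 replicates
investigate_var_importance(bart_machine, num_var_plot = 10,
                           num_replicates_for_avg = 10)
```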