Description
Bayesian additive decision stumps (BADS) is a Bayesian sum-of-trees model in which every tree is a two-leaf-node decision stump.
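A minimal sketch of the model this description implies, assuming the standard BART-style sum-of-trees form with every learner restricted to a single split (a two-leaf stump) and Gaussian errors:

y_i = \sum_{j=1}^{ntree} g(x_i; T_j, \mu_j) + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2),

where each T_j is a stump defined by one splitting rule and \mu_j = (\mu_{j1}, \mu_{j2}) are its two leaf values.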
Usage
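A hedged sketch of the call, assembled from the argument names documented under Arguments; the argument order is an assumption and the default values are not documented here, so they are omitted:

BADS(X, y, x.test,
     sigdf, sigquant, k, lambda, sigest, sigmaf,
     ntree, nskip, ndpost, Tmin,
     printevery, save_trees, rule,
     pre_train, n_pre_train)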
Arguments

X: samples-by-features matrix.

y: response vector.

x.test: test samples-by-features matrix.

sigdf, sigquant, k, lambda, sigest, sigmaf: see ?BART::wbart.

ntree: number of decision stumps.

nskip, ndpost: number of burn-in and posterior draws, respectively.

Tmin: minimum number of samples allowed in a leaf node.

printevery: print progress every 'printevery' iterations.

save_trees: whether to save all the trees from each iteration as a list.

rule: the splitting rule of a node. Choices are: 1. "grp": Gaussian random projection, which draws a length-p vector from the standard normal distribution as the linear-combination coefficients of the p variables; 2. "sgrp": sparse Gaussian random projection, which generates sparse linear-combination coefficients; 3. "bart": the original BART splits, which are axis-aligned; 4. "hyperplane": randomly connect two points from the node to partition the node space. (An illustrative sketch of the "bart" and "grp" partitions follows this argument list.)

pre_train: whether to pre-train the model using the "bart" rule before switching to another splitting rule.

n_pre_train: number of pre-training iterations.
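The splitting rules differ only in how the samples reaching a node are partitioned into two leaves. Below is a minimal R sketch of the "bart" (axis-aligned) and "grp" (Gaussian random projection) partitions; the function name, the cutpoint choice, and all other details are hypothetical and not taken from the package:

## Illustrative only: partition the rows of a node's design matrix with
## either an axis-aligned split or a Gaussian random projection split.
split_node <- function(X_node, rule = c("bart", "grp")) {
  rule <- match.arg(rule)
  p <- ncol(X_node)
  if (rule == "bart") {
    j <- sample.int(p, 1)              # pick one variable at random
    cut <- sample(X_node[, j], 1)      # axis-aligned cutpoint (assumed choice)
    left <- X_node[, j] <= cut
  } else {                             # "grp"
    w <- rnorm(p)                      # standard-normal projection coefficients
    proj <- as.vector(X_node %*% w)    # linear combination of the p variables
    cut <- sample(proj, 1)             # cutpoint on the projected values (assumed choice)
    left <- proj <= cut
  }
  list(left = which(left), right = which(!left))
}

## Example: split 10 samples with 4 features using the "grp" rule.
split_node(matrix(rnorm(40), 10, 4), rule = "grp")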
Value

BADS returns a list with the following elements.

yhat.train: a matrix with ndpost rows and nrow(X) columns.

yhat.test: a matrix with ndpost rows and nrow(x.test) columns.

yhat.train.mean: posterior mean of the MCMC draws of the training-data fits.

yhat.test.mean: posterior mean of the MCMC draws of the test-data fits.

sigma: draws of the random-error variance; length = nskip + ndpost.

tree_history: if save_trees = TRUE, a list of all trees.
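A hedged end-to-end sketch of fitting the model on simulated data and inspecting these return values; the argument values shown are illustrative, and the exact defaults and behaviour are assumptions based on the documentation above:

## Simulated data (assumes BADS is loaded and accepts these named arguments).
set.seed(1)
n <- 200; p <- 5
X <- matrix(rnorm(n * p), n, p)
y <- X[, 1] - 2 * (X[, 2] > 0) + rnorm(n)
x.test <- matrix(rnorm(50 * p), 50, p)

fit <- BADS(X, y, x.test = x.test,
            ntree = 50, nskip = 100, ndpost = 1000,
            rule = "grp", save_trees = FALSE)

dim(fit$yhat.train)          # ndpost x nrow(X)
head(fit$yhat.test.mean)     # posterior mean fit for the test samples
plot(fit$sigma, type = "l")  # error-variance draws across nskip + ndpost iterations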
Author(s)

Dongyue Xie <dongyxie@gmail.com>
References

Chipman, H., George, E., and McCulloch, R. (2010). BART: Bayesian additive regression trees. The Annals of Applied Statistics, 4(1), 266-298. <doi:10.1214/09-AOAS285>