View source: R/ML_BARTMachineModel.R
BARTMachineModel R Documentation

Description

Builds a BART model for regression or classification.

Usage
BARTMachineModel(
num_trees = 50,
num_burn = 250,
num_iter = 1000,
alpha = 0.95,
beta = 2,
k = 2,
q = 0.9,
nu = 3,
mh_prob_steps = c(2.5, 2.5, 4)/9,
verbose = FALSE,
...
)
Arguments

num_trees: number of trees to be grown in the sum-of-trees model.

num_burn: number of MCMC samples to be discarded as "burn-in".

num_iter: number of MCMC samples to draw from the posterior distribution.

alpha, beta: base and power hyperparameters in the tree prior for whether a node is nonterminal or not.

k: regression prior probability that E(Y|X) is between the minimum and maximum values of the response.

q: quantile of the prior on the error variance at which the data-based estimate is placed.

nu: regression degrees of freedom for the inverse chi-squared prior on the error variance.

mh_prob_steps: vector of prior probabilities for proposing changes to the tree structures: (GROW, PRUNE, CHANGE).

verbose: logical indicating whether to print progress information about the algorithm.

...: additional arguments passed to bartMachine::bartMachine.
Details

Response types: binary factor, numeric

Automatic tuning of grid parameters: alpha, beta, k, nu
Further model details can be found in the source link below.
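As a minimal usage sketch (not part of the original documentation; the argument values below are arbitrary and chosen only for illustration), the constructor arguments above are set when specifying the model, and extra bartMachine arguments, such as a seed, may be forwarded through ... under the assumption that bartMachine accepts them:

## Sketch only: requires the suggested bartMachine package
library(MachineShop)

## Specify a BART model with non-default MCMC settings;
## seed is assumed to be forwarded to bartMachine::bartMachine via ...
model <- BARTMachineModel(num_trees = 100, num_burn = 500, num_iter = 2000,
                          k = 3, seed = 123)

## Use the specification like any other MLModel
res <- resample(sale_amount ~ ., data = ICHomes, model = model)
summary(res)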
In calls to varimp for BARTMachineModel, argument type may be specified as "splits" (default) for the proportion of times each predictor is chosen for a splitting rule or as "trees" for the proportion of times each predictor appears in a tree. Argument num_replicates is also available to control the number of BART replicates used in estimating the inclusion proportions [default: 5]. Variable importance is automatically scaled to range from 0 to 100. To obtain unscaled importance values, set scale = FALSE. See the example below.
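In addition to that example, a brief sketch (not from the original page) of the alternative type = "trees" option, building on the model_fit object created in the Examples section; the num_replicates value is arbitrary:

## Sketch only: inclusion proportions based on appearances in trees,
## averaged over 10 BART replicates and left unscaled
varimp(model_fit, method = "model", type = "trees", num_replicates = 10,
       scale = FALSE)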
Value

MLModel class object.
See Also

bartMachine, fit, resample
Examples

## Requires prior installation of suggested package bartMachine to run

model_fit <- fit(sale_amount ~ ., data = ICHomes, model = BARTMachineModel)
varimp(model_fit, method = "model", type = "splits", num_replicates = 20,
       scale = FALSE)
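As a follow-up sketch (not part of the original examples), the fitted object can also be used with MachineShop's predict method; newdata simply reuses the training data here for illustration:

## Sketch only: predictions from the fitted BART model
head(predict(model_fit, newdata = ICHomes))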