View source: R/gpb.model.dt.tree.R
gpb.model.dt.tree    R Documentation
Description

Parse a GPBoost model JSON dump into a data.table structure.
Usage

gpb.model.dt.tree(model, num_iteration = NULL)
Arguments

model: object of class gpb.Booster

num_iteration: number of iterations you want to predict with. NULL or <= 0 means use the best iteration
Value

A data.table with detailed information about the model trees' nodes and leaves.
The columns of the data.table are:
tree_index: ID of a tree in a model (integer)
split_index: ID of a node in a tree (integer)
split_feature: for a node, the name of the feature used for the split (character);
for a leaf, it is simply "NA"
node_parent: ID of the parent node for the current node (integer)
leaf_index: ID of a leaf in a tree (integer)
leaf_parent: ID of the parent node for the current leaf (integer)
split_gain: Split gain of a node
threshold: Splitting threshold value of a node
decision_type: Decision type of a node
default_left: Determines how NA values are handled: TRUE -> left, FALSE -> right
internal_value: Node value
internal_count: The number of observations collected by a node
leaf_value: Leaf value
leaf_count: The number of observations collected by a leaf
Examples

library(gpboost)

# load the bundled example data and construct a gpb.Dataset
data(agaricus.train, package = "gpboost")
train <- agaricus.train
dtrain <- gpb.Dataset(train$data, label = train$label)

# parameters for a small binary classification model
params <- list(
    objective = "binary"
    , learning_rate = 0.01
    , num_leaves = 63L
    , max_depth = -1L
    , min_data_in_leaf = 1L
    , min_sum_hessian_in_leaf = 1.0
)

# train for 10 boosting iterations, then parse the trees into a data.table
model <- gpb.train(params, dtrain, 10L)
tree_dt <- gpb.model.dt.tree(model)
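The returned object is an ordinary data.table, so the trees can be summarised with standard data.table operations. The following is a minimal sketch (not part of the package's documented examples) built on the model and tree_dt objects from above; it assumes that leaf rows carry NA in split_index and node rows carry NA in leaf_index, which follows the column layout described earlier.

library(data.table)

# split the rows into internal nodes and leaves
# (assumption: leaf rows have NA split_index, node rows have NA leaf_index)
nodes  <- tree_dt[!is.na(split_index)]
leaves <- tree_dt[!is.na(leaf_index)]

# number of leaves in each tree
leaves[, .N, by = tree_index]

# total split gain accumulated per feature across all trees,
# a rough gain-based view of feature importance
nodes[, .(total_gain = sum(split_gain)), by = split_feature][order(-total_gain)]

# restrict parsing to the first 5 boosting iterations
tree_dt_5 <- gpb.model.dt.tree(model, num_iteration = 5L)
length(unique(tree_dt_5$tree_index))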