summary.gbm | R Documentation
Computes the relative influence of each variable in the gbm object.
## S3 method for class 'gbm'
summary(
  object,
  cBars = length(object$var.names),
  n.trees = object$n.trees,
  plotit = TRUE,
  order = TRUE,
  method = relative.influence,
  normalize = TRUE,
  ...
)
object
    a gbm object created from an initial call to gbm.

cBars
    the number of bars to plot. If order = TRUE then only the variables with the cBars largest relative influence will appear in the barplot. If order = FALSE then the first cBars variables will appear in the plot. In either case, the function returns the relative influence of all of the variables.

n.trees
    the number of trees used to generate the plot. Only the first n.trees trees will be used.

plotit
    an indicator as to whether the plot is generated.

order
    an indicator as to whether the plotted and/or returned relative influences are sorted.

method
    the function used to compute the relative influence. relative.influence is the default and is the same as that described in Friedman (2001). The other current (and experimental) choice is permutation.test.gbm, which randomly permutes each predictor variable, one at a time, and computes the associated reduction in predictive performance.

normalize
    if FALSE then summary.gbm returns the unnormalized influence.

...
    other arguments passed to the plot function.
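A minimal usage sketch, assuming the gbm package is installed; the simulated data, the variable names (Y, X1, X2, X3), and the tuning values below are illustrative and not part of this help page:

library(gbm)

# Simulated regression data (illustrative)
set.seed(123)
N  <- 1000
X1 <- runif(N)
X2 <- runif(N)
X3 <- factor(sample(letters[1:4], N, replace = TRUE))
Y  <- X1 + 2 * X2 + rnorm(N, sd = 0.3)
dat <- data.frame(Y, X1, X2, X3)

# Fit a small boosted regression model
fit <- gbm(Y ~ X1 + X2 + X3,
           data = dat,
           distribution = "gaussian",
           n.trees = 200,
           interaction.depth = 2,
           shrinkage = 0.05)

# Plot and return the relative influence of each predictor
rel.inf <- summary(fit, n.trees = 200, order = TRUE)
print(rel.inf)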
For distribution = "gaussian" this returns exactly the reduction of squared error attributable to each variable. For other loss functions this returns the reduction, attributable to each variable, in the sum of squared error of predicting the gradient at each iteration. It describes the relative influence of each variable in reducing the loss function. See the references below for exact details on the computation.
Returns a data frame where the first component is the variable name and the second is the computed relative influence, normalized to sum to 100.
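A short sketch of working with the returned data frame and the method argument, assuming fit from the sketch above. The equivalence check rests on the assumption that, with normalization and ordering turned off, the returned values come directly from the method function; permutation.test.gbm is the experimental alternative named in the Arguments:

# Unnormalized, unsorted influence, returned without plotting
ri.raw <- summary(fit, plotit = FALSE, normalize = FALSE, order = FALSE)
ri.raw$var      # variable names
ri.raw$rel.inf  # influence values

# These should agree with the default method called directly
# (assuming no influence values are negative)
direct <- relative.influence(fit, n.trees = fit$n.trees)
all.equal(unname(ri.raw$rel.inf), unname(direct))

# Experimental permutation-based alternative (requires keep.data = TRUE,
# the default, in the original gbm call)
ri.perm <- summary(fit, method = permutation.test.gbm, plotit = FALSE)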
Greg Ridgeway gregridgeway@gmail.com
J.H. Friedman (2001). "Greedy Function Approximation: A Gradient Boosting Machine," Annals of Statistics 29(5):1189-1232.
L. Breiman (2001). "Random Forests," https://www.stat.berkeley.edu/users/breiman/randomforest2001.pdf.
gbm