Computes the relative influence of each variable in the gbm object.
summary(object,
        cBars = length(object$var.names),
        n.trees = object$n.trees,
        plotit = TRUE,
        order = TRUE,
        method = relative.influence,
        normalize = TRUE,
        ...)
object 
a gbm object created from an initial call to gbm. 
cBars 
the number of bars to plot. If order=TRUE then only the variables with the cBars largest relative influence will appear in the barplot. If order=FALSE then the first cBars variables will appear in the plot. In either case, the function returns the relative influence of all of the variables. 
n.trees 
the number of trees used to generate the plot. Only the first n.trees trees will be used. 

plotit 
an indicator as to whether the plot is generated. 
order 
an indicator as to whether the plotted and/or returned relative influences are sorted. 
method 
the function used to compute the relative influence. relative.influence is the default and is the same as that described in Friedman (2001). The other current (and experimental) choice is permutation.test.gbm, which randomly permutes each predictor variable one at a time and computes the associated reduction in predictive performance, similar to the variable importance measure Breiman (2001) uses for random forests. 
normalize 
if FALSE then summary.gbm returns the unnormalized influence. 
... 
other arguments passed to the plot function. 
For distribution="gaussian"
this returns exactly the reduction
of squared error attributable to each variable. For other loss functions this
returns the reduction attributable to each variable in the sum of squared error in
predicting the gradient on each iteration. It describes the relative influence
of each variable in reducing the loss function. See the references below for
exact details on the computation.
Returns a data frame where the first component is the variable name and the second is the computed relative influence, normalized to sum to 100.
Greg Ridgeway gregridgeway@gmail.com
J.H. Friedman (2001). "Greedy Function Approximation: A Gradient Boosting Machine," Annals of Statistics 29(5):1189-1232.
L. Breiman (2001). "Random Forests," Machine Learning 45(1):5-32.
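A brief usage sketch (assumes the gbm package is installed; the simulated data, variable names, and tuning settings below are illustrative only and are not part of this documentation):

```r
library(gbm)

# Simulate a small regression problem (illustrative)
set.seed(1)
N  <- 1000
x1 <- runif(N)
x2 <- runif(N)
x3 <- factor(sample(letters[1:4], N, replace = TRUE))
y  <- x1^2 + 2 * x2 + rnorm(N, sd = 0.3)
d  <- data.frame(y, x1, x2, x3)

# Fit a gbm model with squared-error loss
fit <- gbm(y ~ x1 + x2 + x3, data = d, distribution = "gaussian",
           n.trees = 100, interaction.depth = 2, shrinkage = 0.1)

# Relative influence of each variable, normalized to sum to 100;
# set plotit = TRUE to also draw the barplot
ri <- summary(fit, n.trees = 100, plotit = FALSE)
print(ri)  # data frame with columns var and rel.inf
```

With normalize = FALSE the rel.inf column holds the raw (unnormalized) reductions instead of percentages.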