varimp: R Documentation
Calculate measures of relative importance for model predictor variables.
varimp(
  object,
  method = c("permute", "model"),
  scale = TRUE,
  sort = c("decreasing", "increasing", "asis"),
  ...
)
object: model fit result.

method: character string specifying the calculation of variable importance as permutation-based ("permute") or model-specific ("model").

scale: logical value or vector indicating whether importance values are scaled to a maximum of 100.

sort: character string specifying the sort order of importance values to be "decreasing", "increasing", or left in their original order ("asis").

...: arguments passed to model-specific or permutation-based variable importance functions, including method-specific arguments and their default values.
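As a minimal sketch of how these arguments combine (model_fit is a hypothetical stand-in for any supported fit result, such as the gbm_fit object created in the example below):

## model_fit: hypothetical fit result from a supported model
## Model-specific importance, left in the original predictor order
vi_model <- varimp(model_fit, method = "model", sort = "asis")

## Permutation importance (the default method), sorted in increasing order
vi_perm <- varimp(model_fit, sort = "increasing")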
The varimp
function supports calculation of variable importance with
the permutation-based method of Fisher et al. (2019) or with model-based
methods where defined. Permutation-based importance is the default and has
the advantages of being available for any model, any performance metric
defined for the associated response variable type, and any predictor variable
in the original training dataset. Conversely, model-specific importance is
not defined for some models and will fall back to the permutation method in
such cases; is generally limited to metrics implemented in the source
packages of models; and may be computed on derived, rather than original,
predictor variables. These disadvantages can make comparisons of
model-specific importance across different classes of models infeasible. A
downside of the permutation-based approach is increased computation time. To
counter this, the permutation algorithm can be run in parallel simply by
loading a parallel backend for the foreach package's %dopar% function, such as doParallel or doSNOW.
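The following is a minimal sketch of that parallel setup, assuming the doParallel package is installed; gbm_fit stands in for a fitted model such as the one in the example below.

## Register a parallel backend for foreach's %dopar% operator
library(doParallel)
registerDoParallel(cores = 2)

## Permutation importance is now computed in parallel
vi <- varimp(gbm_fit)

## Optionally revert to sequential execution afterwards
foreach::registerDoSEQ()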
Permutation variable importance is interpreted as the contribution of a predictor variable to the predictive performance of a model as measured by the performance metric used in the calculation. Importance of a predictor is conditional on and, with the default scaling, relative to the values of all other predictors in the analysis.
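For instance, a minimal sketch (reusing the gbm_fit object from the example below) of obtaining unscaled values, which are returned as computed by the chosen importance method rather than rescaled to a maximum of 100:

## Unscaled variable importance values
vi_raw <- varimp(gbm_fit, scale = FALSE)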
The function returns a VariableImportance class object.
Fisher, A., Rudin, C., & Dominici, F. (2019). All models are wrong, but many are useful: Learning a variable's importance by studying an entire class of prediction models simultaneously. Journal of Machine Learning Research, 20, 1-81.
See also: plot.
## Requires prior installation of suggested package gbm to run

## Survival response example
library(MachineShop)   # assumed source of fit, GBMModel, and varimp
library(survival)

gbm_fit <- fit(Surv(time, status) ~ ., data = veteran, model = GBMModel)
(vi <- varimp(gbm_fit))
plot(vi)