View source: R/VariableImportancePlot.R
VariableImportancePlot: R Documentation

Barplot comparing feature importance across different learning methods.
VariableImportancePlot(DT = NULL, RF = NULL, GBM = NULL)
DT: A fitted decision tree model object.

RF: A fitted random forest model object.

GBM: A fitted gradient boosting model object.
This function returns a barplot comparing the standardized feature importance of the predictors across different tree-based machine learning methods. The importance measures are computed via the caret package.
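As a rough illustration of what "standardized" means here: caret's varImp(..., scale = TRUE) rescales raw importance scores onto a common 0 to 100 range so that models can be compared side by side. The sketch below re-implements that min-max rescaling; the feature names and raw scores are made-up illustrative values, not output of this package.

```r
# Hypothetical raw importance scores for three predictors (illustrative only)
raw <- c(alcohol = 1.30, sulphates = 0.85, pH = 0.42)

# Min-max rescaling to the 0-100 range, in the spirit of
# caret::varImp(..., scale = TRUE), so that importances from
# different models share a common scale
standardize <- function(x) (x - min(x)) / (max(x) - min(x)) * 100

scaled <- standardize(raw)
round(scaled, 1)  # the largest raw score maps to 100, the smallest to 0
```

After this rescaling, the most important predictor in each model always sits at 100, which is what makes bars from different learners directly comparable in the plot.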
library(gbm)
colnames(training)[14] <- "perf"
ensemblist <- TreeModels(traindata = training,
                         methodlist = c("dt", "rf", "gbm"),
                         checkprogress = TRUE)
VariableImportancePlot(DT = ensemblist$ModelObject$rpart,
                       RF = ensemblist$ModelObject$ranger,
                       GBM = ensemblist$ModelObject$gbm)
VariableImportancePlot(RF = ensemblist$ModelObject$ranger,
                       GBM = ensemblist$ModelObject$gbm)
VariableImportancePlot(DT = ensemblist$ModelObject$rpart)
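For readers who want to see the shape of the comparison without fitting any models, the plot can be approximated in base R from a matrix of standardized (0-100) importances. The scores and feature names below are hypothetical illustrative values, not output of this package.

```r
# Hypothetical standardized importances (0-100) for three predictors
# under three tree-based learners; one row per method, one column per feature
imp <- rbind(
  DT  = c(alcohol = 100, sulphates = 35, pH = 62),
  RF  = c(alcohol = 100, sulphates = 54, pH = 88),
  GBM = c(alcohol = 100, sulphates = 47, pH = 71)
)

# Grouped barplot: one group of bars per feature, one bar per method,
# mirroring the side-by-side comparison VariableImportancePlot produces
barplot(imp, beside = TRUE, legend.text = rownames(imp),
        ylab = "Standardized importance",
        main = "Variable importance by method")
```

Because every row is on the same 0-100 scale, differences in bar height within a group reflect genuine disagreement between the learners rather than differences in their raw importance units.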