autoplot.importance_perm: Visualize importance scores

View source: R/plots.R

autoplot.importance_perm    R Documentation

Visualize importance scores

Description

Visualize importance scores

Usage

## S3 method for class 'importance_perm'
autoplot(
  object,
  top = Inf,
  metric = NULL,
  eval_time = NULL,
  type = "importance",
  std_errs = stats::qnorm(0.95),
  ...
)

Arguments

object

A tibble of results from importance_perm().

top

An integer for how many terms to show. When there are multiple metrics, predictors are ranked within each metric and the average rank is used to define importance. In the case of tied rankings, all ties are included.

metric

A character vector or NULL for which metric(s) to plot. By default, all metrics are shown via facets. Possible options are the entries in the .metric column of the object.

eval_time

For censored regression models, a vector of time points at which the survival probability is estimated.

type

A character value. The default is "importance", which shows the overall signal-to-noise ratio (i.e., the mean divided by its standard error). Alternatively, "difference" shows the mean difference with standard error bounds (a usage sketch follows this list of arguments).

std_errs

The number of standard errors to plot (when type = "difference").

...

Not used.
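
A minimal usage sketch combining these arguments, assuming the pre-computed imp_orig and imp_derv objects that are loaded in the Examples below, where "brier_class" is one of the metrics in imp_derv:

# Show only the five highest-ranked predictors (ranks averaged across metrics)
autoplot(imp_orig, top = 5)

# Plot mean differences for a single metric with ~95% two-sided bounds
# (stats::qnorm(0.975) instead of the default stats::qnorm(0.95))
autoplot(imp_derv, metric = "brier_class", type = "difference",
         std_errs = stats::qnorm(0.975))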

Value

A ggplot2 object.
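
Because the return value is an ordinary ggplot2 object, it can be customized with additional layers. A minimal sketch, assuming ggplot2 is attached and the imp_orig object from the Examples below:

library(ggplot2)
autoplot(imp_orig, top = 10) +
  labs(title = "Permutation importance") +
  theme_minimal()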

Examples

# Pre-computed results. See code at
system.file("make_imp_example.R", package = "important")

# Load the results
load(system.file("imp_examples.RData", package = "important"))

# A classification model with two classes and highly correlated predictors.
# To preprocess them, PCA feature extraction is used.
#
# Let’s first view the importance in terms of the original predictor set
# using 50 permutations:

imp_orig

autoplot(imp_orig, top = 10)

# Now assess the importance in terms of the PCA components

imp_derv

autoplot(imp_derv)
autoplot(imp_derv, metric = "brier_class", type = "difference")
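
# A hedged follow-up sketch: metric also accepts a character vector to facet
# on a subset of metrics. The name "roc_auc" is hypothetical here; use values
# that actually appear in imp_derv$.metric.
autoplot(imp_derv, metric = c("brier_class", "roc_auc"))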
