cbc_compare: Compare multiple choice experiment designs

View source: R/compare.R


Compare multiple choice experiment designs

Description

This function compares multiple CBC designs across key quality metrics including D-error, balance, overlap, and structural characteristics. Useful for evaluating different design methods or parameter settings.

Usage

cbc_compare(..., metrics = "all", sort_by = "d_error", ascending = NULL)

Arguments

...

Any number of cbc_design objects to compare, separated by commas. Arguments can be named for clearer output (e.g., random = design1, stochastic = design2).

metrics

Character vector specifying which metrics to compare. Options: "structure", "efficiency", "balance", "overlap", or "all" (default). Multiple metrics can be specified, e.g. c("efficiency", "balance").

sort_by

Character. Metric to sort designs by. Options: "d_error" (default), "balance", "overlap", "profiles_used", "generation_time", or "none".

ascending

Logical. If TRUE, sorts in ascending order (appropriate when lower values are better); if FALSE, sorts in descending order (higher is better). The default depends on the chosen metric.

Value

A cbc_comparison object containing the comparison results, which prints as a formatted table.

Examples

library(cbcTools)

# Create profiles
profiles <- cbc_profiles(
  price = c(1, 2, 3),
  type = c("A", "B", "C"),
  quality = c("Low", "High")
)

# Create different designs
design_random <- cbc_design(
  profiles = profiles,
  method = "random",
  n_alts = 2, n_q = 4
)

design_stochastic <- cbc_design(
  profiles = profiles,
  method = "stochastic",
  n_alts = 2, n_q = 4
)

# Compare designs
cbc_compare(design_random, design_stochastic)

# Named comparison with specific metrics
cbc_compare(
  Random = design_random,
  Stochastic = design_stochastic,
  metrics = c("efficiency", "balance"),
  sort_by = "d_error"
)

jhelvy/cbcTools documentation built on July 17, 2025, 3:02 a.m.