Fixes and robustness improvements
- Fixed a bug in parallelize = "variables" mode where empty index chunks could lead to uninitialized results.
- Added safeguards (tryCatch, zero-length handling) to prevent "object 'results' not found" errors during variable-level parallel imputation (see the sketch after this list).
- Improved robustness of the ranger and randomForest backends in low-dimensional or degenerate datasets.
- Expanded the internal test suite with ADAPTS-like stress tests covering:
  - mixed-type small datasets
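For illustration, the fragment below sketches the kind of zero-length guard and tryCatch wrapping described above. It is not the package's actual internals: the chunk list and the impute_one_variable() helper are hypothetical stand-ins, and in missForest the chunks come from itertools::isplitVector() / iterators::idiv().

```r
# Illustrative sketch only -- not missForest's actual code.
library(foreach)
library(doParallel)

registerDoParallel(cores = 2)

# Hypothetical stand-in for the real per-variable imputation step
impute_one_variable <- function(j) paste("imputed column", j)

# Hypothetical index chunks; note that a chunk can be empty
chunks <- list(c(2L, 5L), 7L, integer(0))

results <- foreach(idx = chunks, .combine = c) %dopar% {
  if (length(idx) == 0L) return(NULL)        # zero-length guard: skip empty chunks
  lapply(idx, function(j) {
    tryCatch(impute_one_variable(j),
             error = function(e) NULL)       # degrade gracefully instead of failing
  })
}

stopImplicitCluster()
```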
New features

- missForest() can now fit forests via ranger with backend = "ranger" (the default) or retain legacy behavior with backend = "randomForest".
- New argument backend = c("ranger", "randomForest") (see the usage sketch after these notes).
- New argument num.threads (used by ranger). In parallelize = "variables" mode, per-variable ranger calls use num.threads = 1 to avoid oversubscription.

Argument mapping (randomForest to ranger)

- ntree → num.trees
- nodesize = c(num, fac) → min.bucket (regression/classification respectively); the default nodesize = c(5, 1) is interpreted consistently across backends (numeric first, factor second)
- sampsize (counts) → sample.fraction (overall or per-class fractions; see the mapping sketch after these notes)
- classwt → class.weights
- cutoff is emulated via probability forests plus post-thresholding (see the sketch after these notes)
- maxnodes has no exact equivalent and is ignored by ranger (consider max.depth at the ranger level if needed)

Parallelization

- parallelize = "variables": builds per-variable forests in parallel via foreach; ranger runs with num.threads = 1 inside each task.
- parallelize = "forests": uses ranger's internal threading (via num.threads) or combines sub-forests for randomForest.

Internal changes

- Calls to other packages are fully qualified (e.g., stats::predict) and only what is used is explicitly imported.
- The chunk variable (xntree) in foreach is localized/bound in the loop, avoiding an undefined global binding.
- Work is split across workers with itertools::isplitVector and iterators::idiv.

Documentation and packaging

- Help pages link to \link[randomForest]{randomForest} and \link[ranger]{ranger}.
- CITATION now uses bibentry() with person()/c() and a DOI.
- DESCRIPTION updated to Version: 1.6 with the current Date and an Authors@R field.
- Imports trimmed to include only used packages (e.g., ranger, randomForest, foreach, iterators, itertools, doRNG, stats).
- Exported functions: missForest, mixError, nrmse, prodNA, varClass.

Notes

- maxnodes is ignored with backend = "ranger"; consider max.depth at the ranger level if tree depth control is required.
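As a usage sketch of the 1.6 interface described above (backend, num.threads, parallelize): the iris-based data preparation via prodNA() and the doParallel registration are illustrative choices, not requirements.

```r
library(missForest)
library(doParallel)

set.seed(81)
iris.mis <- prodNA(iris, noNA = 0.1)          # introduce ~10% missing values

# Default backend: ranger, with internal multithreading
imp.ranger <- missForest(iris.mis, backend = "ranger", num.threads = 2)

# Legacy behavior via randomForest
imp.rf <- missForest(iris.mis, backend = "randomForest")

# Variable-level parallelism: register a foreach backend first;
# each per-variable ranger call then uses num.threads = 1 internally
registerDoParallel(cores = 2)
imp.par <- missForest(iris.mis, parallelize = "variables", backend = "ranger")
stopImplicitCluster()
```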
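The mapping sketch below shows how a count-based sampsize might be translated into ranger's sample.fraction, as the mapping list suggests. The helper is illustrative only, not missForest's internal code.

```r
# Illustrative helper (not missForest internals): translate a randomForest-style
# sampsize (counts) into ranger's sample.fraction.
sampsize_to_fraction <- function(sampsize, y) {
  if (length(sampsize) == 1L) {
    # overall count -> overall fraction of rows
    sampsize / length(y)
  } else {
    # per-class counts -> per-class fractions (order must match levels(y))
    sampsize / as.numeric(table(y))
  }
}

# Example: 20 observations per class in a 3-class, 50-per-class problem
sampsize_to_fraction(c(20, 20, 20), iris$Species)   # -> 0.4 0.4 0.4
```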
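Finally, a sketch of the cutoff emulation idea mentioned above: fit a ranger probability forest, then pick the class that maximizes the ratio of predicted probability to its cutoff, mirroring randomForest's voting rule. The cutoff values here are hypothetical, and this is not missForest's exact implementation.

```r
# Illustrative sketch of emulating randomForest's cutoff with a
# ranger probability forest plus post-thresholding.
library(ranger)

fit  <- ranger(Species ~ ., data = iris, probability = TRUE, num.trees = 100)
prob <- predict(fit, data = iris)$predictions          # class-probability matrix

cutoff <- c(setosa = 0.5, versicolor = 0.3, virginica = 0.2)  # hypothetical cutoffs

# Choose the class maximizing probability / cutoff, as randomForest does with votes
ratio <- sweep(prob[, names(cutoff)], 2, cutoff, "/")
pred  <- colnames(ratio)[max.col(ratio, ties.method = "first")]
head(pred)
```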