jgabry/loo: Efficient Leave-One-Out Cross-Validation and WAIC for Bayesian Models

Efficient approximate leave-one-out cross-validation (LOO) for Bayesian models fit using Markov chain Monte Carlo. The approximation uses Pareto smoothed importance sampling (PSIS), a new procedure for regularizing importance weights. As a byproduct of the calculations, we also obtain approximate standard errors for estimated predictive errors and for the comparison of predictive errors between models. The package also provides methods for using stacking and other model weighting techniques to average Bayesian predictive distributions.
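A minimal sketch of the workflow described above, assuming two hypothetical pointwise log-likelihood matrices `log_lik1` and `log_lik2` (the function names `loo()`, `compare()`, and `loo_model_weights()` are from the loo package; the input matrices themselves are placeholders):

```r
library(loo)

# Hypothetical S x N pointwise log-likelihood matrices from two fitted models
# (S posterior draws, N observations); with rstan these can be obtained via
# loo::extract_log_lik() applied to a stanfit object.
loo1 <- loo(log_lik1)  # PSIS-LOO estimate, with SE, for model 1
loo2 <- loo(log_lik2)  # PSIS-LOO estimate, with SE, for model 2

# Difference in expected log predictive density between the models,
# with an approximate standard error for the comparison
compare(loo1, loo2)

# Stacking weights for averaging the models' predictive distributions
loo_model_weights(list(loo1, loo2), method = "stacking")
```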

Getting started

Package details

Maintainer: Jonah Gabry <[email protected]>
License: GPL (>=3)
URL: http://mc-stan.org, http://discourse.mc-stan.org
Package repository: View on GitHub
Installation: Install the latest version of this package by entering the following in R:
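The install command itself is missing from this page; a typical way to install an R package directly from GitHub, assuming the remotes package, is:

```r
# Install remotes if not already available, then install loo from GitHub
install.packages("remotes")
remotes::install_github("jgabry/loo")
```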
jgabry/loo documentation built on Nov. 4, 2018, 10:43 p.m.