Description

Estimates the Kullback-Leibler divergence (KLD), which measures how one probability distribution diverges from a reference distribution (equal means are assumed). Both matrices must be positive definite inverse covariance (precision) matrices for an accurate measurement. This is a relative metric.
Arguments

base    Full or base model
test    Reduced or testing model
Value

A non-negative value. Smaller values indicate that the probability distribution of the reduced model is close to that of the full model; a value of 0 means the two distributions are identical.
Author(s)

Alexander Christensen <alexpaulchristensen@gmail.com>
References

Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22, 79-86.
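The measure described above corresponds to the closed-form KL divergence between two zero-mean Gaussian distributions parameterized by their precision (inverse covariance) matrices. A minimal Python/NumPy sketch of that standard formula, for illustration only; the function name `gaussian_kld` is hypothetical and this is not the package's own implementation:

```python
import numpy as np

def gaussian_kld(base, test):
    """KL divergence D(base || test) for two zero-mean Gaussians,
    each given as a positive definite precision (inverse covariance) matrix.
    Illustrative sketch, not the package's implementation."""
    d = base.shape[0]
    cov_base = np.linalg.inv(base)           # covariance of the full/base model
    trace_term = np.trace(test @ cov_base)   # tr(P_test * Sigma_base)
    # log-determinant term: ln(det P_base / det P_test), via slogdet for stability
    _, logdet_base = np.linalg.slogdet(base)
    _, logdet_test = np.linalg.slogdet(test)
    return 0.5 * (trace_term - d + logdet_base - logdet_test)
```

Identical matrices yield a divergence of 0, and the value grows as the reduced model's distribution moves away from the full model's, matching the interpretation in the Value section.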