The Kullback-Leibler divergence can be used to measure the discrepancy between two correlation matrices. Although originally defined for probability density functions, the literature shows how it can be extended to correlation matrices. With this measure, one can objectively assess the effectiveness of a particular filtering strategy for correlation matrices.
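As a rough illustration (this is not the package's code, and `kl_div` is a hypothetical name), a common way to extend KL to correlation matrices is to treat each matrix as the covariance of a zero-mean Gaussian and compute the Gaussian KL divergence:

```r
# Sketch, assuming the Gaussian form of the KL divergence:
# KL(S1 || S2) = 0.5 * ( log(det(S2)/det(S1)) + tr(S2^-1 S1) - n )
kl_div <- function(sigma.1, sigma.2) {
  n <- nrow(sigma.1)
  0.5 * (log(det(sigma.2) / det(sigma.1)) +
         sum(diag(solve(sigma.2) %*% sigma.1)) - n)
}

s1 <- diag(2)                              # identity correlation matrix
s2 <- matrix(c(1, 0.5, 0.5, 1), nrow = 2)  # correlated pair
kl_div(s1, s1)  # 0 for identical matrices
kl_div(s1, s2)  # positive for distinct matrices
```

Note the asymmetry: `kl_div(s1, s2)` and `kl_div(s2, s1)` generally differ, which is why KL is a divergence rather than a true distance.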
Usage

divergence(h, count, window = NULL, filter, measure = 'information')
divergence.kl(sigma.1, sigma.2)
divergence_lim(m, t = NULL)
stability_lim(m, t = NULL)
divergence.stability(h, count, window, filter)
plotDivergenceLimit.kl(m, t.range, ..., overlay = FALSE)

Arguments

h - A zoo object representing a portfolio, with dimensions T x M
count - The number of bootstrap observations to create
window - The number of samples to include in each observation. Defaults to anylength(h).
filter - The correlation filter to measure
measure - The type of divergence to calculate. Possible choices are 'information' (the default) or 'stability'.
sigma.1 - The sample correlation matrix
sigma.2 - The model correlation matrix (i.e. the filtered matrix)
m - The number of assets
t - The number of samples (dates) in an observation
t.range - A range of date samples. This can be a simple interval so long as it matches the number of samples per asset in the measured correlation matrix.
overlay - Whether to overlay the divergence limit plot on an existing plot
... - Additional parameters to pass to plot or lines

Value

A summary of the divergence calculation, including the mean divergence and an effective limit based on a random matrix.
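To make the idea of scoring a filter concrete, here is a minimal base-R sketch, assuming a hypothetical shrinkage filter (`shrink` and `kl` are illustrative names, not functions from this package): the filtered matrix is compared against the sample correlation matrix via the Gaussian KL divergence.

```r
set.seed(42)
h <- matrix(rnorm(60 * 5), nrow = 60, ncol = 5)  # T x M panel of returns
sample_cor <- cor(h)

# Hypothetical filter: shrink the sample correlations toward the identity.
shrink <- function(sigma, lambda = 0.3) {
  (1 - lambda) * sigma + lambda * diag(nrow(sigma))
}
filtered <- shrink(sample_cor)

# Gaussian KL divergence between two correlation matrices (assumed form).
kl <- function(s1, s2) {
  0.5 * (log(det(s2) / det(s1)) +
         sum(diag(solve(s2) %*% s1)) - nrow(s1))
}
score <- kl(sample_cor, filtered)  # lower means the filter distorts less
score
```

A lower score means the filtered matrix stays closer to the sample matrix; comparing scores across candidate filters, and against the random-matrix limit mentioned above, is the objective comparison the description refers to.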
Author(s)

Brian Lee Yung Rowe