KL_divergence: Symmetrised Kullback-Leibler divergence (KL divergence)


Description

This function calculates the symmetrised Kullback-Leibler divergence (KL divergence) between each pair of classes. Designed for use with KLFDA.
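As an illustration of the underlying formula (a sketch, not the package's R implementation), the symmetrised divergence between two discrete distributions p and q is the sum of the two directed KL divergences, KL(p || q) + KL(q || p):

```python
import numpy as np

def sym_kl(p, q, eps=1e-12):
    """Symmetrised KL divergence KL(p||q) + KL(q||p) for discrete distributions."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # normalise to valid probability vectors and clip to avoid log(0)
    p = np.clip(p / p.sum(), eps, None)
    q = np.clip(q / q.sum(), eps, None)
    kl_pq = np.sum(p * np.log(p / q))
    kl_qp = np.sum(q * np.log(q / p))
    return kl_pq + kl_qp
```

Unlike the directed KL divergence, this quantity is symmetric in its arguments and is zero only when the two distributions coincide.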

Usage

KL_divergence(obj)

Arguments

obj

The KLFDA object. Users can modify it to suit their own purposes.

Value

Returns the symmetrised KL divergence between each pair of classes.
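Because the divergence is symmetric, the pairwise values between classes form a symmetric matrix with a zero diagonal. A minimal sketch (hypothetical helper names, not the package's R API), assuming each class is summarised by a discrete probability vector:

```python
import numpy as np

def sym_kl(p, q, eps=1e-12):
    """Symmetrised KL divergence KL(p||q) + KL(q||p) for discrete distributions."""
    p = np.clip(np.asarray(p, float) / np.sum(p), eps, None)
    q = np.clip(np.asarray(q, float) / np.sum(q), eps, None)
    return np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p))

def pairwise_sym_kl(class_dists):
    """Build the symmetric matrix of divergences between all class pairs."""
    n = len(class_dists)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = sym_kl(class_dists[i], class_dists[j])
            D[i, j] = D[j, i] = d  # symmetry: only compute each pair once
    return D
```

Each off-diagonal entry D[i, j] measures how separable classes i and j are under this divergence.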

Note

This function is useful for estimating the loss between the reduced features and the original features. It has been adopted in t-SNE to assess projection performance.

Author(s)

qinxinghu@gmail.com

References

Van Erven, T., & Harremoës, P. (2014). Rényi divergence and Kullback-Leibler divergence. IEEE Transactions on Information Theory, 60(7), 3797-3820.

Pierre Enel (2019). Kernel Fisher Discriminant Analysis (https://www.github.com/p-enel/MatlabKFDA), GitHub.


xinghuq/DA documentation built on July 11, 2021, 8:49 a.m.