Description Usage Arguments Value References Examples
The Hilbert-Schmidt Independence Criterion (HSIC) is a measure of dependence between two random variables. If characteristic kernels are used for both variables, the population HSIC is zero if and only if the variables are independent. This function implements an unbiased estimator of the HSIC measure. Specifically, for two positive-definite kernel matrices K and L computed from a sample of size n, the unbiased HSIC estimator is:
HSIC(K, L) = \frac{1}{n(n-3)} \left[ \mathrm{trace}(KL) + \frac{1^\top K 1 \, 1^\top L 1}{(n-1)(n-2)} - \frac{2}{n-2} 1^\top K L 1 \right]
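The estimator above can be sketched directly in R. This is a minimal illustration based on the formula, not the package's actual source; following Song et al. (2007), the unbiased estimator is computed on kernel matrices whose diagonals are set to zero, which is an assumption not stated explicitly in the formula above.

```r
# Sketch of the unbiased HSIC estimator (assumed implementation, not the
# package source). K and L are n x n kernel similarity matrices.
HSIC <- function(K, L) {
  n <- nrow(K)
  # The unbiased estimator of Song et al. (2007) zeroes the diagonals.
  diag(K) <- 0
  diag(L) <- 0
  ones <- rep(1, n)
  term1 <- sum(diag(K %*% L))                               # trace(KL)
  term2 <- (sum(K) * sum(L)) / ((n - 1) * (n - 2))          # 1'K1 1'L1 / ((n-1)(n-2))
  term3 <- (2 / (n - 2)) * drop(ones %*% K %*% L %*% ones)  # (2/(n-2)) 1'KL1
  (term1 + term2 - term3) / (n * (n - 3))
}
```

For example, with linear kernel matrices built from two independent samples, the estimate should be close to zero on average.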
HSIC(K, L)
K: first kernel similarity matrix

L: second kernel similarity matrix
An unbiased estimate of the HSIC measure.
Song, L., Smola, A., Gretton, A., Borgwardt, K., & Bedo, J. (2007). Supervised Feature Selection via Dependence Estimation. https://doi.org/10.1145/1273496.1273600