Description
A DML algorithm that learns a metric associated with the nearest Gaussian distribution that satisfies the similarity constraints. The nearest Gaussian distribution is obtained by minimizing the Kullback-Leibler divergence.
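The KL-divergence minimization above is carried out in ITML by iterated rank-one Bregman projections onto the individual constraints (Algorithm 1 of the cited Davis et al. paper). The following NumPy sketch illustrates that update; it is not this package's implementation — the helper name and signature are hypothetical, and the bounds here apply to squared distances:

```python
import numpy as np

def itml_fit(X, pairs, deltas, lower_bound=1.0, upper_bound=10.0,
             gamma=1.0, max_iter=50, tol=1e-3):
    # Sketch of ITML's Bregman-projection updates (Davis et al., 2007).
    # `itml_fit` and its signature are illustrative, not the package API.
    # deltas[c] = +1: similar pair, squared distance should be <= lower_bound;
    # deltas[c] = -1: dissimilar pair, squared distance should be >= upper_bound.
    d = X.shape[1]
    A = np.eye(d)                          # initial metric: identity
    lam = np.zeros(len(pairs))             # one dual variable per constraint
    xi = np.where(np.asarray(deltas) == 1, lower_bound, upper_bound).astype(float)
    for _ in range(max_iter):
        A_prev = A.copy()
        for c, ((i, j), delta) in enumerate(zip(pairs, deltas)):
            v = X[i] - X[j]
            p = float(v @ A @ v)           # squared distance under current A
            if p < 1e-12:
                continue
            # Projection step size, capped by the dual variable (slack via gamma).
            alpha = min(lam[c], 0.5 * delta * (1.0 / p - gamma / xi[c]))
            beta = delta * alpha / (1.0 - delta * alpha * p)
            xi[c] = gamma * xi[c] / (gamma + delta * alpha * xi[c])
            lam[c] -= alpha
            Av = A @ v
            A = A + beta * np.outer(Av, Av)  # rank-one update of the metric
        if np.linalg.norm(A - A_prev) < tol:
            break                          # tol: stop criterion, as in the docs
    return A
```

The rank-one update keeps the metric symmetric and positive definite, so the learned matrix remains a valid Mahalanobis metric throughout.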
Arguments

initial_metric
  A positive definite matrix that defines the initial metric used to compare instances.

upper_bound
  Bound for dissimilarity constraints. If NULL, it will be estimated from up_perc. Float.

lower_bound
  Bound for similarity constraints. If NULL, it will be estimated from low_perc. Float.

num_constraints
  Number of constraints to generate. If NULL, it will be taken as 40 * k * (k - 1), where k is the number of classes. Integer.

gamma
  The gamma value for slack variables. Float.

tol
  Tolerance stop criterion for the algorithm. Float.

max_iter
  Maximum number of iterations of the algorithm. Integer.

low_perc
  Lower percentile (from 0 to 100) used to estimate the lower bound from the dataset. Ignored if lower_bound is provided. Integer.

up_perc
  Upper percentile (from 0 to 100) used to estimate the upper bound from the dataset. Ignored if upper_bound is provided. Integer.
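The defaults described in the arguments above — 40 * k * (k - 1) constraints and percentile-based bound estimation — can be sketched as follows. This is an illustrative Python helper, not this package's interface; the function name, return shape, and the 5/95 default percentiles are assumptions:

```python
import numpy as np

def make_constraints(X, y, num_constraints=None, low_perc=5, up_perc=95, rng=None):
    # Hypothetical helper mirroring the documented defaults; not the package API.
    rng = np.random.default_rng(rng)
    k = len(np.unique(y))
    if num_constraints is None:
        # Documented default: 40 * k * (k - 1) constraints for k classes.
        num_constraints = 40 * k * (k - 1)
    # Sample random index pairs; +1 marks same-class (similar) pairs, -1 others.
    idx = rng.integers(0, len(X), size=(num_constraints, 2))
    deltas = np.where(y[idx[:, 0]] == y[idx[:, 1]], 1, -1)
    # Estimate the bounds as percentiles of the sampled pairwise distances:
    # similar pairs should fall below lower_bound, dissimilar above upper_bound.
    dists = np.linalg.norm(X[idx[:, 0]] - X[idx[:, 1]], axis=1)
    lower_bound = np.percentile(dists, low_perc)
    upper_bound = np.percentile(dists, up_perc)
    return idx, deltas, lower_bound, upper_bound
```

Passing explicit upper_bound / lower_bound values would simply bypass the percentile estimates, which is why the documentation notes that low_perc and up_perc are ignored in that case.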
Value

The ITML transformer, structured as a named list.
References

Jason V. Davis et al. "Information-theoretic metric learning". In: Proceedings of the 24th International Conference on Machine Learning. ACM, 2007, pp. 209-216.