ITML: Information Theoretic Metric Learning (ITML).

Description Usage Arguments Value References

Description

A DML algorithm that learns a metric associated with the nearest Gaussian distribution satisfying the given similarity constraints. The nearest Gaussian distribution is obtained by minimizing the Kullback-Leibler divergence.

Usage

ITML(initial_metric = NULL, upper_bound = NULL, lower_bound = NULL,
  num_constraints = NULL, gamma = 1, tol = 0.001, max_iter = 1e+05,
  low_perc = 5, up_perc = 95)

Arguments

initial_metric

A positive definite matrix that defines the initial metric used to compare samples.

upper_bound

Bound for dissimilarity constraints. If NULL, it will be estimated from up_perc. Float.

lower_bound

Bound for similarity constraints. If NULL, it will be estimated from low_perc. Float.

num_constraints

Number of constraints to generate. If NULL, it will be taken as 40 * k * (k-1), where k is the number of classes. Integer.

gamma

The gamma value for slack variables. Float.

tol

Tolerance stop criterion for the algorithm. Float.

max_iter

Maximum number of iterations for the algorithm. Integer.

low_perc

Lower percentile (from 0 to 100) to estimate the lower bound from the dataset. Ignored if lower_bound is provided. Integer.

up_perc

Upper percentile (from 0 to 100) to estimate the upper bound from the dataset. Ignored if upper_bound is provided. Integer.

Value

The ITML transformer, structured as a named list.
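
A minimal usage sketch, under the assumption that the returned named list exposes fit and transform entries, as in the pyDML interface that rDML wraps; the exact entry names may differ in your installed version:

```r
# Hypothetical sketch: learn an ITML metric on the iris data.
# Assumes rDML is installed and that the transformer list has
# fit() and transform() functions (an assumption, not confirmed
# by this documentation page).
library(rDML)

data(iris)
X <- as.matrix(iris[, 1:4])  # feature matrix
y <- iris[, 5]               # class labels

# Default bounds are estimated from the 5th and 95th distance
# percentiles (low_perc = 5, up_perc = 95).
itml <- ITML(gamma = 1, tol = 0.001, max_iter = 1e+05)

# Fit with similarity/dissimilarity constraints drawn from the
# labels, then map the data into the learned metric space.
itml$fit(X, y)
Xt <- itml$transform(X)
```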

References

Jason V Davis et al. “Information-theoretic metric learning”. In: Proceedings of the 24th international conference on Machine learning. ACM. 2007, pages. 209-216.


jlsuarezdiaz/rDML documentation built on May 24, 2019, 12:35 a.m.