kdca: Kernel Discriminative Component Analysis


Description

Performs kernel discriminative component analysis on the given data.

Usage

kdca(k, chunks, neglinks, useD)

Arguments

k

n x n kernel matrix, the result of the kmatrixGauss function; n is the number of samples.

chunks

n * 1 vector describing the chunklets: -1 in the i-th position means that point i does not belong to any chunklet; an integer j in position i means that point i belongs to chunklet j. The chunklet indices should be 1:(number of chunklets).

neglinks

s * s matrix describing the negative relationships between the s chunklets. For element neglinks_{ij}: neglinks_{ij} = 1 means chunklets i and j have negative constraint(s); neglinks_{ij} = -1 means chunklets i and j have no negative constraints, or no such information is available.

useD

Optional. When not given, DCA is performed in the original dimension and B has full rank. When useD is given, DCA is preceded by constraint-based LDA, which reduces the dimension to useD; B then has rank useD.
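As a concrete illustration of the chunks and neglinks encodings described above, the following base R snippet builds both objects for six points grouped into two chunklets (the variable names and values are illustrative only, not part of the function's interface):

```r
# chunks: six points; the first three form chunklet 1,
# the next two form chunklet 2, and the last point is
# unassigned (-1 means "belongs to no chunklet")
chunks <- c(1, 1, 1, 2, 2, -1)

# neglinks: 2 x 2 matrix over the two chunklets;
# 1 marks a negative constraint between chunklets 1 and 2,
# -1 marks "no constraint / no information" (here, the diagonal)
neglinks <- matrix(c(-1,  1,
                      1, -1),
                   nrow = 2, byrow = TRUE)
```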

Details

kdca performs discriminative component analysis (DCA) in the feature space induced by the kernel matrix k. The positive equivalence information encoded by chunks and the negative constraints encoded by neglinks are used to learn a transformation that contracts points within the same chunklet while separating chunklets connected by negative constraints. See the reference below for the underlying method.

Value

list of the KDCA results:

B

The Mahalanobis matrix suggested by KDCA.

DCA

The KDCA-suggested transformation of the data; its dimension is (original data dimension) x useD.

newData

The KDCA-transformed data.


Author(s)

Nan Xiao <https://nanx.me>

References

Steven C. H. Hoi, W. Liu, M. R. Lyu and W. Y. Ma (2006). Learning distance metrics with contextual constraints for image retrieval. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2006).

See Also

See kmatrixGauss for the Gaussian kernel computation, and dca for the linear version of DCA.

Examples

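A minimal end-to-end sketch, assuming kmatrixGauss() accepts a numeric data matrix and returns the n x n Gaussian kernel matrix expected by kdca (the toy data, constraints, and useD setting below are illustrative):

```r
library(sdml)

set.seed(42)
# nine 2-dimensional points in three chunklets of three points each
x <- matrix(rnorm(9 * 2), nrow = 9)
chunks <- rep(1:3, each = 3)

# negative constraints between chunklet 1 and chunklets 2 and 3;
# -1 elsewhere means "no constraint / no information"
neglinks <- matrix(c(-1,  1,  1,
                      1, -1, -1,
                      1, -1, -1),
                   nrow = 3, byrow = TRUE)

k <- kmatrixGauss(x)  # n x n Gaussian kernel matrix
result <- kdca(k, chunks, neglinks, useD = 2)

result$B        # suggested Mahalanobis matrix
result$DCA      # suggested transformation
result$newData  # transformed data
```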

road2stat/sdml documentation built on May 27, 2019, 10:31 a.m.