kpcca                                                      R Documentation
Description:

Using must-link and cannot-link constraints, KPCCA (Mignon & Jurie, 2012)
learns a projection into a low-dimensional space where the distances
between pairs of data points respect the desired constraints, exhibiting
good generalization properties in the presence of high-dimensional data.
This is a kernelized version of pcca.
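The idea can be sketched with the loss from Mignon & Jurie (2012): each constrained pair contributes a generalized logistic (softplus) penalty on its squared distance in the projected space, with sign +1 for must-link pairs and -1 for cannot-link pairs. The code below is an illustrative sketch of that objective, not the package's internal implementation; the function names `softplus` and `pcca_loss` are hypothetical.

```r
## Sketch of the PCCA-style loss (assumption: generalized logistic loss
## l_beta(x) = log(1 + exp(beta * x)) / beta, applied to d^2 - 1 for
## must-link pairs and to 1 - d^2 for cannot-link pairs).
softplus <- function(x, beta = 1) log1p(exp(beta * x)) / beta

pcca_loss <- function(Z, ML, CL, beta = 1) {
  ## Squared Euclidean distance between two projected points (rows of Z)
  d2 <- function(pair) sum((Z[pair[1], ] - Z[pair[2], ])^2)
  ## Must-link pairs are penalized for being far apart ...
  loss_ml <- sum(apply(ML, 1, function(p) softplus(d2(p) - 1, beta)))
  ## ... and cannot-link pairs for being close together.
  loss_cl <- sum(apply(CL, 1, function(p) softplus(1 - d2(p), beta)))
  loss_ml + loss_cl
}
```

Moving a must-link pair closer together (or a cannot-link pair farther apart) decreases this loss, which is what the learned projection exploits.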
Usage:

kpcca(K, d1, ML, CL, beta = 1, epsi = 1e-04, etamax = 0.1, disp = TRUE)
Arguments:

K       Gram matrix of size n x n.

d1      Number of extracted features.

ML      Matrix of must-link constraints, of size nbML x 2. Each row of ML
        contains the indices of two objects that belong to the same class.

CL      Matrix of cannot-link constraints, of size nbCL x 2. Each row of CL
        contains the indices of two objects that belong to different classes.

beta    Sharpness parameter in the loss function (default: 1).

epsi    Minimal rate of change of the cost function (default: 1e-4).

etamax  Maximum step in the line search algorithm (default: 0.1).

disp    If TRUE (default), intermediate results are displayed.
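The ML and CL arguments are plain two-column index matrices, so they can also be built by hand from class labels. The snippet below is an illustrative alternative to the package's create_MLCL helper (which samples constraints randomly); here all pairs are enumerated instead.

```r
## Build ML/CL matrices directly from class labels
## (illustrative only; not the package's create_MLCL code).
y <- c(1, 1, 2, 2, 1)             # toy class labels for 5 objects
pairs <- t(combn(length(y), 2))   # all pairs of object indices, one per row
same  <- y[pairs[, 1]] == y[pairs[, 2]]
ML <- pairs[same, , drop = FALSE]   # pairs of objects in the same class
CL <- pairs[!same, , drop = FALSE]  # pairs of objects in different classes
```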
Value:

A list with three components:

- The n x d1 matrix of extracted features.
- The projection matrix of size d1 x n.
- The Euclidean distance matrix in the projected space.
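Assuming the returned distance matrix holds the pairwise Euclidean distances between the extracted features, it can be recomputed from the feature matrix with base R's dist(), for example:

```r
## Recompute pairwise Euclidean distances from an extracted-feature matrix
## (assumption: the returned distance matrix equals dist() of the features).
z <- matrix(c(0, 3, 0, 4), ncol = 2)  # toy 2 x 2 feature matrix, rows = objects
D <- as.matrix(dist(z))               # Euclidean distance matrix
```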
Author(s):

Thierry Denoeux.
References:

A. Mignon and F. Jurie. PCCA: a new approach for distance learning from sparse pairwise constraints. In 2012 IEEE Conference on Computer Vision and Pattern Recognition, pages 2666-2672, 2012.
See Also:

pcca, create_MLCL
Examples:

## Not run:
library(kernlab)

## Generate a banana-shaped two-class data set and plot it
data <- bananas(400)
plot(data$x, pch = data$y, col = data$y)

## Generate must-link and cannot-link constraints from the labels
const <- create_MLCL(data$y, 1000)

## Compute the Gram matrix with an RBF kernel
rbf <- rbfdot(sigma = 0.2)
K <- kernelMatrix(rbf, data$x)

## Extract one feature with KPCCA and plot it, colored by class
res.kpcca <- kpcca(K, d1 = 1, ML = const$ML, CL = const$CL, beta = 1)
plot(res.kpcca$z, col = data$y)
## End(Not run)