kpca    R Documentation
Kernel PCA (Scholkopf et al. 1997; Scholkopf & Smola 2002; Tipping 2001) by SVD factorization of the weighted Gram matrix D^(1/2) * Phi(X) * Phi(X)' * D^(1/2), where D is a (n, n) diagonal matrix of weights for the observations (rows of X).
kpca(X, weights = NULL, nlv, kern = "krbf", ...)
## S3 method for class 'Kpca'
transform(object, X, ..., nlv = NULL)
## S3 method for class 'Kpca'
summary(object, ...)
X: For the main function: training X-data (n, p). For transform: new X-data to project.
weights: Weights (n) applied to the rows of X. Default to NULL (uniform weights).
nlv: The number of PCs to calculate.
kern: Name of the function defining the kernel used to build the Gram matrix. See krbf for the syntax and the other available kernel functions. (A schematic RBF Gram matrix is sketched after this list.)
object: A fitted model, output of a call to the main function.
...: Optional arguments to pass to the kernel function defined in kern.
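For illustration only, the following base-R sketch shows one common way to compute a Gaussian (RBF) Gram matrix of the kind used in the factorization above. It assumes the parameterization K[i, j] = exp(-gamma * ||x_i - y_j||^2); the helper rbf_gram is hypothetical and is not the package's krbf implementation, whose conventions may differ.

## Sketch (base R): Gaussian (RBF) Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2).
## Illustration only; rbf_gram is a hypothetical helper, not the package's krbf().
rbf_gram <- function(X, Y = X, gamma = 1) {
  ## squared Euclidean distances between rows of X and rows of Y
  d2 <- outer(rowSums(X^2), rowSums(Y^2), "+") - 2 * tcrossprod(X, Y)
  exp(-gamma * pmax(d2, 0))   # pmax guards against tiny negative values
}
K <- rbf_gram(matrix(rnorm(20), ncol = 4), gamma = .6)
dim(K)   # 5 x 5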
See the examples.
Scholkopf, B., Smola, A., Müller, K.-R., 1997. Kernel principal component analysis, in: Gerstner, W., Germond, A., Hasler, M., Nicoud, J.-D. (Eds.), Artificial Neural Networks - ICANN 97, Lecture Notes in Computer Science. Springer, Berlin, Heidelberg, pp. 583-588. https://doi.org/10.1007/BFb0020217
Scholkopf, B., Smola, A.J., 2002. Learning with kernels: support vector machines, regularization, optimization, and beyond, Adaptive computation and machine learning. MIT Press, Cambridge, Mass.
Tipping, M.E., 2001. Sparse kernel principal component analysis. Advances in neural information processing systems, MIT Press. http://papers.nips.cc/paper/1791-sparse-kernel-principal-component-analysis.pdf
## Simulated training data
n <- 5 ; p <- 4
X <- matrix(rnorm(n * p), ncol = p)
nlv <- 3

## Kernel PCA with a Gaussian (RBF) kernel
kpca(X, nlv = nlv, kern = "krbf")
fm <- kpca(X, nlv = nlv, kern = "krbf", gamma = .6)
fm$T                               # scores
transform(fm, X[1:2, ])            # projection of new observations
transform(fm, X[1:2, ], nlv = 1)   # projection on the first PC only
summary(fm)

## Comparison: usual PCA ...
nlv <- 3
pcasvd(X, nlv = nlv)$T
## ... and kernel PCA with a polynomial kernel
kpca(X, nlv = nlv, kern = "kpol")$T
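As a further illustration of the factorization described above, here is a minimal base-R sketch of classical kernel PCA, assuming uniform weights (D = (1/n) * I) and a linear kernel; it is not the package implementation, and scaling conventions may differ. With a linear kernel, the scores from the doubly centered Gram matrix coincide, up to column signs, with usual PCA scores.

## Sketch (base R): classical kernel PCA via eigendecomposition of the centered Gram
## matrix, assuming uniform weights and a linear kernel. Not the package implementation.
n <- 5 ; p <- 4
X <- matrix(rnorm(n * p), ncol = p)
K <- tcrossprod(X)                              # linear-kernel Gram matrix Phi(X) Phi(X)'
J <- matrix(1 / n, n, n)
Kc <- K - J %*% K - K %*% J + J %*% K %*% J     # double centering in feature space
eig <- eigen(Kc, symmetric = TRUE)
nlv <- 3
T.k <- eig$vectors[, 1:nlv] %*% diag(sqrt(eig$values[1:nlv]))   # kernel PCA scores
T.p <- prcomp(X)$x[, 1:nlv]                     # usual PCA scores (stats::prcomp)
round(abs(T.k) - abs(T.p), 6)                   # ~0: identical up to column signs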