kpca {kernlab} | R Documentation
Kernel Principal Components Analysis is a nonlinear form of principal component analysis.
## S4 method for signature 'formula'
kpca(x, data = NULL, na.action, ...)
## S4 method for signature 'matrix'
kpca(x, kernel = "rbfdot", kpar = list(sigma = 0.1),
features = 0, th = 1e-4, na.action = na.omit, ...)
## S4 method for signature 'kernelMatrix'
kpca(x, features = 0, th = 1e-4, ...)
## S4 method for signature 'list'
kpca(x, kernel = "stringdot", kpar = list(length = 4, lambda = 0.5),
features = 0, th = 1e-4, na.action = na.omit, ...)
x: the data matrix indexed by row, or a formula describing the model, or a kernel matrix of class kernelMatrix, or a list of character vectors (when a string kernel is used).
data: an optional data frame containing the variables in the model (when using a formula).
kernel: the kernel function used in training and predicting. This parameter can be set to any function of class kernel which computes a dot product between two vector arguments. kernlab provides the most popular kernel functions, which can be used by setting the kernel parameter to one of the following strings:

  rbfdot      Radial Basis kernel function ("Gaussian")
  polydot     Polynomial kernel function
  vanilladot  Linear kernel function
  tanhdot     Hyperbolic tangent kernel function
  laplacedot  Laplacian kernel function
  besseldot   Bessel kernel function
  anovadot    ANOVA RBF kernel function
  splinedot   Spline kernel function
  stringdot   String kernel function

The kernel parameter can also be set to a user-defined function of class kernel by passing the function name as an argument (see the sketch after this argument list).
kpar: the list of hyper-parameters (kernel parameters). This is a list containing the parameters to be used with the kernel function. Valid parameters for the existing kernels are:

  sigma                        inverse kernel width for rbfdot and laplacedot
  degree, scale, offset        for polydot
  scale, offset                for tanhdot
  sigma, order, degree         for besseldot
  sigma, degree                for anovadot
  length, lambda, normalized   for stringdot

Hyper-parameters for user-defined kernels can be passed through the kpar parameter as well.
features: the number of features (principal components) to return (default: 0, meaning all components with an eigenvalue above th).
th: the eigenvalue threshold below which principal components are ignored (only valid when features = 0; default: 1e-4).
na.action: a function specifying the action to be taken if NAs are found. The default action is na.omit, which leads to rejection of cases with missing values on any required variable. An alternative is na.fail, which causes an error if NA cases are found. (NOTE: if given, this argument must be named.)
...: additional parameters.
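As a brief illustration of the kernel and kpar arguments, the following sketch (not part of the original page; the quadratic kernel mykernel and the use of the iris data are assumptions made only for illustration) passes a user-defined function of class kernel to kpca:

library(kernlab)

## a hypothetical user-defined kernel: an inhomogeneous quadratic dot product
mykernel <- function(x, y) (sum(x * y) + 1)^2
class(mykernel) <- "kernel"   # mark the function as a kernel for kernlab

data(iris)
## user-defined kernels take no kpar list; any parameters live inside the function
kpc_user <- kpca(as.matrix(iris[, -5]), kernel = mykernel, features = 2)
head(rotated(kpc_user))       # the data projected onto the first two components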
Using kernel functions one can efficiently compute principal components in high-dimensional feature spaces, related to the input space by some nonlinear map. The data can be passed to the kpca function in a matrix or a data.frame; in addition, kpca also supports input in the form of a kernel matrix of class kernelMatrix or as a list of character vectors when a string kernel is used.
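For instance, a precomputed kernel matrix can be handed to kpca directly. The following sketch (the iris data and the RBF kernel with sigma = 0.2 are chosen only for illustration) uses the kernelMatrix method:

library(kernlab)
data(iris)
## compute the kernel matrix explicitly and pass it to kpca
K <- kernelMatrix(rbfdot(sigma = 0.2), as.matrix(iris[, -5]))
kpc_K <- kpca(K, features = 2)
head(rotated(kpc_K))   # projections obtained from the precomputed kernel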
kpca returns an S4 object containing the principal component vectors along with the corresponding eigenvalues. The object has the following slots:
pcv: a matrix containing the principal component vectors (column-wise)
eig: the corresponding eigenvalues
rotated: the original data projected (rotated) onto the principal components
xmatrix: the original data matrix
All the slots of the object can be accessed by accessor functions, and the predict function can be used to embed new data points in the principal component space, as sketched below.
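A minimal sketch of these accessors and of predict (the split of the iris data into a training and a test part is arbitrary and only for illustration):

library(kernlab)
data(iris)
fit <- kpca(~ ., data = iris[1:100, -5], kernel = "rbfdot",
            kpar = list(sigma = 0.2), features = 2)
eig(fit)                                # eigenvalues of the retained components
dim(pcv(fit))                           # principal component vectors, one per column
head(rotated(fit))                      # training data projected onto the components
head(xmatrix(fit))                      # the original data matrix used for fitting
head(predict(fit, iris[101:150, -5]))   # embed previously unseen rows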
Author: Alexandros Karatzoglou <alexandros.karatzoglou@ci.tuwien.ac.at>
Reference: Schoelkopf B., A. Smola and K.-R. Mueller, "Nonlinear component analysis as a kernel eigenvalue problem", Neural Computation 10, 1299-1319 (1998). doi:10.1162/089976698300017467
See also: kcca, pca
# an example using the iris data
data(iris)
test <- sample(1:150,20)
kpc <- kpca(~.,data=iris[-test,-5],kernel="rbfdot",
kpar=list(sigma=0.2),features=2)
#print the principal component vectors
pcv(kpc)
#plot the data projection on the components
plot(rotated(kpc),col=as.integer(iris[-test,5]),
xlab="1st Principal Component",ylab="2nd Principal Component")
#embed remaining points
emb <- predict(kpc,iris[test,-5])
points(emb,col=as.integer(iris[test,5]))
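Since kpca also accepts a list of character vectors together with a string kernel, a further sketch could look as follows (the toy strings below are made up purely for illustration):

# a toy string-kernel example: the input is a list of character strings
docs <- list("the cat sat on the mat",
             "a dog sat on the mat",
             "kernel methods for pattern analysis",
             "support vector machines and kernels")
skpc <- kpca(docs, kernel = "stringdot",
             kpar = list(length = 4, lambda = 0.5), features = 2)
rotated(skpc)   # coordinates of the four strings in kernel PC space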