Description Usage Arguments Value References Examples
Local principal component analysis (LPCA) is a variant of PCA suited to data sets with clustered observations (Yang, Zhang, & Yang, 2006). LPCA starts from individual observations and uncovers the structure of the data by considering only the points within a distance δ of each observation. In this implementation, distances are measured with the Minkowski metric, a generalized distance that bridges the L1 and L2 norms (and norms of order higher than 2), so it encompasses both Manhattan and Euclidean distance as special cases. This distance-based nearest-neighbor strategy is used to build an adjacency matrix, which is converted to its graph Laplacian representation. The Laplacian is then subjected to eigendecomposition, and its eigenvectors P are multiplied by the data x to obtain the matrix XP. Next, the cross-product of XP gives the rotation matrix R, i.e., R = XP'XP, and the data are projected onto the new basis. A more complete description of the algorithm can be found in the cited paper. Note that the new basis can have more columns than there are variables, and that the eigenvalues are not ordered, owing to the local nature of the components.
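The steps above can be sketched in base R. This is an illustration of the described algorithm, not the package's implementation; the function name `lpca_sketch` and the rule used to pick the δ threshold (the `pct` quantile of pairwise distances) are assumptions made for this sketch.

```r
lpca_sketch <- function(x, ncomp = 2, pct = 0.15, Lp = 2, scale. = TRUE) {
  x <- as.matrix(x)
  if (scale.) x <- scale(x)                 # optionally center and scale the variables
  ## pairwise Minkowski distances with norm order Lp
  d <- as.matrix(dist(x, method = "minkowski", p = Lp))
  ## keep the smallest pct of pairwise distances as edges (delta threshold is assumed)
  delta <- quantile(d[upper.tri(d)], probs = pct)
  A <- 1 * (d <= delta)
  diag(A) <- 0
  ## graph Laplacian: L = D - A, with D the diagonal degree matrix
  L <- diag(rowSums(A)) - A
  ## eigendecomposition of the Laplacian; eigenvectors P
  P <- eigen(L, symmetric = TRUE)$vectors
  ## XP = P x, then rotation matrix R = XP'XP
  XP <- P %*% x
  R <- crossprod(XP)
  ## project the data onto the leading columns of the new basis
  x %*% R[, seq_len(ncomp), drop = FALSE]
}
```

For example, `lpca_sketch(iris[, 1:4], ncomp = 2)` returns a 150 × 2 matrix of scores.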
pcaLocal(x, ncomp, pct = 0.15, Lp = 2, scale = TRUE)
x: a data frame or matrix of numeric variables
ncomp: the number of components to extract
pct: the percentage of the distance matrix whose edges will be non-null. Defaults to 0.15.
Lp: the norm order p for the Minkowski distance metric. Defaults to 2 (Euclidean).
scale: should the variables be scaled prior to analysis? Defaults to TRUE.
an object of class PrincipalComp
Yang, J., Zhang, D., & Yang, J. (2006). Locally principal component learning for face representation and recognition. Neurocomputing, 69(13-15), 1697–1701. doi:10.1016/j.neucom.2006.01.009
pcaLocal(x, 3)