ghapca: Generalized Hebbian Algorithm for PCA

ghapca {onlinePCA}    R Documentation

Generalized Hebbian Algorithm for PCA

Description

Online PCA with the GHA of Sanger (1989).

Usage

ghapca(lambda, U, x, gamma, q = length(lambda), center, sort = TRUE)

Arguments

lambda

optional vector of eigenvalues.

U

matrix of eigenvectors (principal components) stored in columns.

x

new data vector.

gamma

gain parameter(s): a single positive number or a vector with one gain per eigenvector (see Details).

q

number of eigenvectors to compute.

center

optional centering vector for x.

sort

Should the new eigenpairs be sorted?

Details

The vector gamma determines the weight placed on the new data in updating each eigenvector (the first coefficient of gamma corresponds to the first eigenvector, and so on). It can be specified as a single positive number or as a vector of length ncol(U). Larger values of gamma place more weight on x and less on U. A common choice for (the components of) gamma is of the form c/n, with n the sample size and c a suitable positive constant.
If sort is TRUE and lambda is not missing, the updated eigenpairs are sorted by decreasing eigenvalue. Otherwise, they are not sorted.
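
A minimal sketch of a single update (not part of the package documentation; the dimensions, gains, and starting values below are arbitrary and chosen only to illustrate scalar versus vector gains):

library(onlinePCA)
set.seed(1)
d <- 5
U <- qr.Q(qr(matrix(rnorm(d * 2), d, 2)))  # two orthonormal starting eigenvectors
lambda <- c(2, 1)                          # current eigenvalue estimates
xnew <- rnorm(d)                           # incoming (approximately centered) observation
ghapca(lambda, U, xnew, gamma = 0.01)              # scalar gain: same weight for both PCs
ghapca(lambda, U, xnew, gamma = c(0.02, 0.005))    # vector gain: more weight on the first PC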

Value

A list with components

values

updated eigenvalues or NULL.

vectors

updated eigenvectors.
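
For example, continuing the illustrative update sketched in the Details section (with lambda, U, and xnew as defined there):

upd <- ghapca(lambda, U, xnew, gamma = 0.01)
upd$values    # updated eigenvalue estimates
upd$vectors   # updated eigenvectors, one per column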

References

Sanger, T.D. (1989). Optimal unsupervised learning in a single-layer linear feedforward neural network. Neural Networks, 2, 459-473.

See Also

sgapca, snlpca

Examples

library(onlinePCA)

## Initialization
n <- 1e4  # sample size
n0 <- 5e3 # initial sample size
d <- 10   # number of variables
q <- d    # number of PCs to compute
x <- matrix(runif(n*d), n, d)
x <- x %*% diag(sqrt(12*(1:d)))
# The eigenvalues of the covariance matrix of x are close to 1, 2, ..., d
# and the corresponding eigenvectors are close to 
# the canonical basis of R^d

## GHA PCA
pca <- princomp(x[1:n0,])    # batch PCA on the first n0 observations for initialization
xbar <- pca$center
pca <- list(values = pca$sdev[1:q]^2, vectors = pca$loadings[, 1:q])
for (i in (n0+1):n) {
  xbar <- updateMean(xbar, x[i,], i-1)    # update the running mean
  pca <- ghapca(pca$values, pca$vectors, x[i,], 2/i, q, xbar)    # GHA update
}
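
## As a quick sanity check (not part of the original example), compare the
## online eigenvalue estimates with a batch eigendecomposition of the
## sample covariance of the full data.
batch <- eigen(cov(x), symmetric = TRUE)
round(cbind(online = pca$values, batch = batch$values[1:q]), 2)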
