Description:

Fits an ICA model by directly estimating the densities of the independent components using Poisson GAMs. The densities have the form of tilted Gaussians, and hence directly estimate the contrast functions that lead to negentropy measures. This function supports Section 14.7.4 of 'Elements of Statistical Learning' (Hastie, Tibshirani and Friedman, 2009, 2nd Edition). Models include 'FastICA'.
Usage:

ProDenICA(x, k = p, W0 = NULL, whiten = FALSE, maxit = 20, thresh = 1e-07,
  restarts = 0, trace = FALSE, Gfunc = GPois, eps.rank = 1e-07, ...)
Arguments:

x: input matrix.

k: number of components required, less than or equal to the number of columns of x.

W0: optional initial matrix (for comparing algorithms).

whiten: logical; should x be whitened? If TRUE, the SVD X = UDV' is computed and U is used (up to rank(X) columns), and k is reduced to min(k, rank(X)). If FALSE (the default), x is assumed to have been pre-whitened by the user (if it has not, the function may not perform properly); see the sketch after this list.

maxit: maximum number of iterations; default is 20.

thresh: convergence threshold, in terms of relative change in the Amari metric; default is 1e-7.

restarts: number of random restarts; default is 0.

trace: if TRUE, trace the iterations; default is FALSE.

Gfunc: contrast functional on which the negentropy measure is based. The default is GPois, which fits the tilted Gaussian densities; G1 gives the FastICA contrast used in the example below.

eps.rank: threshold for deciding the rank of x when whiten=TRUE; default is 1e-7.

...: additional arguments for Gfunc (e.g. density=TRUE for GPois, as used in the example below).
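Because whiten defaults to FALSE, x usually needs to be whitened before the call. Below is a minimal sketch of SVD pre-whitening, mirroring the whiten=TRUE description above (the names x and x.white are illustrative, not package code; the Examples section performs the same steps):

x <- scale(x, center = TRUE, scale = FALSE)  # center the columns
sx <- svd(x)                                 # x = U D V'
x.white <- sqrt(nrow(x)) * sx$u              # whitened data: covariance ~ identity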
Details:

See Section 14.7.4 of 'Elements of Statistical Learning' (Hastie, Tibshirani and Friedman, 2009, 2nd Edition).
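For intuition only, here is a sketch of the standard negentropy approximation that underlies contrast functionals such as G1; this illustrates the idea from Section 14.7, and the function name neg.approx and the log-cosh choice of G are assumptions for the illustration, not the package's internal code:

G <- function(u) log(cosh(u))          # log-cosh contrast, FastICA-style
neg.approx <- function(y, nref = 1e5) {
  z <- rnorm(nref)                     # standard Gaussian reference sample
  (mean(G(y)) - mean(G(z)))^2          # ~0 for Gaussian y; grows with non-Gaussianity
}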
Value:

An object of S3 class "ProDenICA" is returned, with the following components:

W: orthonormal matrix that takes the whitened version of x to the independent components.

negentropy: the total negentropy measure of this solution.

s: the matrix of k independent components.

whitener: if whiten=TRUE, the whitening matrix applied to x.

call: the call that produced this object.

density: if the additional argument density=TRUE is supplied (passed on to GPois), the estimated tilted-Gaussian density for each component, which can be displayed with plot (see the example below).
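For illustration, a short sketch of inspecting these components, assuming fit holds the result of a ProDenICA call such as the GPois fit in the example below:

fit$W            # orthonormal unmixing matrix (whitened x -> components)
fit$negentropy   # total negentropy of the solution
head(fit$s)      # the k independent components, one per column
fit$call         # the call that produced the object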
Author(s):

Trevor Hastie and Rob Tibshirani
References:

Hastie, T. and Tibshirani, R. (2003) Independent Component Analysis through Product Density Estimation, in Advances in Neural Information Processing Systems 15 (Becker, S. and Obermayer, K., eds), MIT Press, Cambridge, MA, pp. 649-656.

Hastie, T., Tibshirani, R. and Friedman, J. (2009) Elements of Statistical Learning (2nd edition), Springer. http://www-stat.stanford.edu/~hastie/Papers/ESLII.pdf
Examples:

library(ProDenICA)

p <- 2
### can use letters a-r below for dist
dist <- "n"
N <- 1024
A0 <- mixmat(p)                        # random p x p mixing matrix
s <- scale(cbind(rjordan(dist, N), rjordan(dist, N)))
x <- s %*% A0
### whiten the data
x <- scale(x, TRUE, FALSE)             # center (but do not scale) the columns
sx <- svd(x)                           ### orthogonalization function
x <- sqrt(N) * sx$u
target <- solve(A0)
target <- diag(sx$d) %*% t(sx$v) %*% target / sqrt(N)
W0 <- matrix(rnorm(2 * 2), 2, 2)
W0 <- ICAorthW(W0)                     # orthogonalize the random start
W1 <- ProDenICA(x, W0 = W0, trace = TRUE, Gfunc = G1)$W
fit <- ProDenICA(x, W0 = W0, Gfunc = GPois, trace = TRUE, density = TRUE)
W2 <- fit$W
### distance of FastICA from target
amari(W1, target)
### distance of ProDenICA from target
amari(W2, target)
par(mfrow = c(2, 1))
plot(fit)
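A possible variant, building on the simulated data above: hand the centered but un-whitened data to ProDenICA and let it whiten internally, adding a few random restarts; the object name fit2 and the argument values here are illustrative, not recommendations:

xraw <- scale(s %*% A0, TRUE, FALSE)   # centered, but not whitened
fit2 <- ProDenICA(xraw, k = 2, whiten = TRUE, restarts = 5, Gfunc = GPois)
fit2$negentropy                        # negentropy of the returned solution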