sfa performs Slow Feature Analysis (SFA) on a
K-dimensional time series with T observations.
Important: this implementation of SFA is only the most basic
version; it is included here merely for convenience in
initialize_weightvector. If you want to use SFA in R, please
use the rSFA package, which has more advanced and efficient
implementations. sfa() here corresponds to
sfa1 in the rSFA package.
a T \times K array containing the K-dimensional time series with T observations.
Slow Feature Analysis (SFA) finds slow signals (see References below) and can be computed quickly (and analytically) by solving a generalized eigenvalue problem. For ForeCA it is important to know that SFA is equivalent to finding the signal with the largest lag-1 autocorrelation.
The disadvantage of SFA for forecasting is that, e.g., white noise (WN) is ranked higher than an AR(1) with negative autocorrelation coefficient ρ_1 < 0. While it is true that WN is slower, it is not more forecastable. Thus we are also interested in the fastest signal, i.e., the last eigenvector. The fastest signal obtained this way corresponds to minimizing the lag-1 autocorrelation (possibly with ρ_1 < 0).
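To make the WN vs. negative-ρ_1 AR(1) comparison concrete, here is a small base-R check (the AR coefficient -0.7, the sample size, and the helper acf1 are illustrative choices, not part of the package):

```r
set.seed(1)
n  <- 2000
wn <- rnorm(n)                                       # white noise
ar <- as.numeric(arima.sim(list(ar = -0.7), n = n))  # AR(1) with rho_1 < 0

# lag-1 sample autocorrelation
acf1 <- function(y) cor(y[-1], y[-length(y)])
acf1(wn)   # close to 0
acf1(ar)   # close to -0.7

# SFA ranks wn as "slower" (larger lag-1 autocorrelation), yet the
# AR(1) is more forecastable: |rho_1| is what matters for one-step
# prediction, not its sign.
```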
Note, though, that maximizing (or minimizing) the lag-1 autocorrelation does
not necessarily yield the most forecastable signal (as measured by
Omega), but it is a good starting point.
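The generalized eigenvalue formulation above can be sketched in base R. This is a minimal sfa1-style linear SFA (whiten, then eigen-decompose the covariance of the differenced signals), not the exact ForeCA::sfa implementation; the simulated mixing matrix and AR coefficient are arbitrary:

```r
set.seed(1)
T <- 500
# Two latent signals: a slow AR(1) (rho_1 = 0.9) and white noise, mixed.
slow  <- as.numeric(arima.sim(list(ar = 0.9), n = T))
noise <- rnorm(T)
X <- cbind(slow + 0.5 * noise, noise - 0.3 * slow)

# 1) Whiten X so that cov(Z) = I.
Xc <- scale(X, center = TRUE, scale = FALSE)
E  <- eigen(cov(Xc))
W  <- E$vectors %*% diag(1 / sqrt(E$values)) %*% t(E$vectors)
Z  <- Xc %*% W

# 2) Eigen-decompose the covariance of the differenced signals; the
#    eigenvector with the SMALLEST eigenvalue gives the slowest feature,
#    the one with the largest eigenvalue gives the fastest.
D <- eigen(cov(diff(Z)))
slowest <- Z %*% D$vectors[, ncol(Z)]
fastest <- Z %*% D$vectors[, 1]

# The slowest feature has the largest lag-1 autocorrelation.
acf1 <- function(y) cor(y[-1], y[-length(y)])
acf1(slowest) > acf1(fastest)  # TRUE
```

For unit-variance features, the lag-1 autocorrelation equals 1 - λ/2 where λ is the corresponding eigenvalue of cov(diff(Z)), which is why minimizing slowness and maximizing lag-1 autocorrelation coincide.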
An object of class
sfa which inherits methods from princomp.
Signals are ordered from slowest to fastest.
Laurenz Wiskott and Terrence J. Sejnowski (2002). “Slow Feature Analysis: Unsupervised Learning of Invariances”, Neural Computation 14:4, 715-770.