knitr::read_chunk("../analysis/chunks.R")



## Introduction

The $\varepsilon$-complexity of a time series is computed by successively downsampling and approximating the original time series. The initial implementation of the $\varepsilon$-complexity procedure used piecewise polynomials as the approximation method. We implemented the procedure in R using three approximation methods: B-splines, cubic splines, and interpolating subdivision (the lifting method). The spline methods use existing libraries; the interpolating subdivision method was implemented following Sweldens[^1].
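
To make the downsample-and-approximate step concrete, the following is a minimal sketch rather than the package's implementation: at each level $S$ every $S$-th point is retained, the full series is reconstructed from the retained points with a cubic spline, and the approximation error is recorded. The function name `approx_errors`, the choice of levels, and the example series are illustrative.

```r
# Minimal sketch (not the package implementation): approximate the series from
# every S-th point with a cubic spline and record the error at each level.
approx_errors <- function(x, levels = 2:6) {
  n <- length(x)
  sapply(levels, function(S) {
    keep <- seq(1, n, by = S)                   # indices retained at this level
    fit  <- spline(keep, x[keep], xout = 1:n)   # cubic spline through retained points
    mean(abs(x - fit$y))                        # mean absolute approximation error
  })
}

set.seed(1)
x    <- cumsum(rnorm(500))   # a random-walk example series
errs <- approx_errors(x)
```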

The purpose of the additional approximation methods was to improve the performance of the complexity coefficients in segmentation and classification tasks. We found that the most accurate approximation method -- B-splines with a large number of basis elements -- did not perform better than cubic splines or the lifting method in classification tasks. Because all methods performed similarly, the computationally efficient cubic spline implementation was used in later applications.

knitr::read_chunk("../code/sim-example.R")
<<sim-example-create>>
<<sim-series>>

## Diagnostic plots

In the theory developed by Piryatinska and Darkhovsky[^2], the log-log regression of the approximation errors taken at each downsampling level $S$ should be roughly linear. (An outline of the estimation procedure is here.) The two parameters defining this linear fit are the complexity coefficients. The implementation of each of the three approximation methods was checked to confirm that this linear relationship held for our set of simulations. The figure below shows the diagnostic plots for the interpolating subdivision method.
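
Assuming the errors and levels from the sketch above, the regression itself is a one-line fit; the intercept and slope of the fit are the two complexity coefficients, and plotting the fit gives the diagnostic check described here. This is an illustrative sketch, not the package's estimation code.

```r
# Log-log regression of approximation errors on downsampling level (using the
# `errs` and levels from the sketch above); the intercept and slope of this
# fit are the complexity coefficients.
levels <- 2:6
fit    <- lm(log(errs) ~ log(levels))
coef(fit)                      # intercept and slope

# Diagnostic check: the points should fall close to the fitted line
plot(log(levels), log(errs))
abline(fit)
```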

```r
<<sim-example-plot>>
```

## Approximation method performance

A plot of the simulations in the feature space of the two complexity coefficients shows that the coefficients generated by each method lie in roughly the same region of the coefficient space. The same processes -- ARMA, logistic map, and fBm -- also appear linearly separable under each method. In other words, all three methods produce roughly similar estimates of the complexity coefficients.
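
For illustration, a feature-space plot of this kind can be drawn with a few lines of ggplot2; `coef_df` is a hypothetical data frame with one row per simulated series and columns for the two coefficients, the generating process, and the approximation method -- it is not an object created by the package.

```r
# Hypothetical feature-space plot: one point per simulated series, coloured by
# the generating process and shaped by the approximation method. `coef_df` is
# an assumed data frame with columns A, B, process, and method.
library(ggplot2)

ggplot(coef_df, aes(x = A, y = B, color = process, shape = method)) +
  geom_point() +
  labs(x = "Complexity coefficient A", y = "Complexity coefficient B")
```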

knitr::read_chunk("../code/ecomplex-plot-feature-space.R")
devtools::load_all()
<<sim-feature-space>>

For the following test, two groups were formed consisting of 30 simulations of each random process or function. Each process had an initial set of parameters, and samples of the process were generated by randomly perturbing these parameters within a small window. As the table below shows, the B-spline method produced the lowest total approximation error. The 'Combined' method, which used the smallest approximation error at each downsampling level, mirrors the B-spline results.
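
A hedged sketch of this simulation setup is below, shown only for an ARMA process with jittered coefficients; the base parameter values and window width are assumptions for illustration, and the actual simulations also cover the other processes.

```r
# Illustrative sketch of the simulation setup: 30 ARMA(1, 1) series, each with
# its coefficients perturbed within a small window around a base value. The
# base values and window width are assumptions, not the values used here.
set.seed(1)
sims <- lapply(1:30, function(i) {
  ar1 <- 0.5 + runif(1, -0.05, 0.05)   # jittered AR coefficient
  ma1 <- 0.3 + runif(1, -0.05, 0.05)   # jittered MA coefficient
  arima.sim(model = list(ar = ar1, ma = ma1), n = 500)
})
```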

```r
<<ecomplex-feature-space>>
devtools::load_all()
prefix <- "ecomplex"

# Load the cached approximation errors and random-forest classification errors
err_means <- readRDS(cache_file("all_errs", prefix))
rf_errs   <- readRDS(cache_file("rf_errs", prefix))

# Label the approximation-error table with the same method names as the
# classification results
err_means <- data.frame(err_means)
names(err_means) <- names(rf_errs)
row.names(err_means) <- row.names(rf_errs)
knitr::kable(err_means, caption = 'Approximation error for each method.')
```

The complexity coefficients were used as features in a random forest classifier, the same classifier we used in later applications. No approximation method was clearly dominant, but the methods with higher approximation errors performed as well as or better than the B-spline method. All three approximation methods are similar -- essentially, the basis elements reproduce lower-order polynomials -- so the result is not too surprising. On the other hand, the results do indicate that a more accurate approximation does not necessarily lead to better performance in classification tasks that use the complexity coefficients as features.
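
For reference, a minimal version of this classification step might look like the following, assuming a hypothetical data frame `features` with the two complexity coefficients `A` and `B` and a factor column `label`; the actual analysis may differ in its train/test split and tuning.

```r
# Minimal sketch of the classification step. `features` is a hypothetical data
# frame with complexity coefficients A and B and a factor column `label`.
library(randomForest)

set.seed(1)
train <- sample(nrow(features), size = 0.7 * nrow(features))
rf    <- randomForest(label ~ A + B, data = features[train, ])
pred  <- predict(rf, features[-train, ])
mean(pred != features[-train, "label"])   # misclassification rate
```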

Additional work could test whether these results hold for simulations run over a larger range of parameters, or whether different classifiers yield similar results.

```r
knitr::kable(rf_errs, caption = 'Classification error for each approximation method.')
```

[^1]: [Build your own wavelets at home](https://link.springer.com/chapter/10.1007%2FBFb0011093?LI=true)
[^2]: [Epsilon-complexity of continuous functions](https://arxiv.org/abs/1303.1777)


### Session information

<!-- Insert the session information into the document -->
```r
# Print the R session information
sessionInfo()
```