Getting Started with NNS: Clustering and Regression"

knitr::opts_chunk$set(echo = TRUE)
require(NNS)
require(knitr)
require(rgl)

Clustering and Regression

Below are some examples demonstrating unsupervised learning with NNS clustering and nonlinear regression using the resulting clusters. As always, for a more thorough description and definition, please view the References.

NNS Partitioning: NNS.part

NNS.part is both a partitional and hierarchical clustering method. NNS iteratively partitions the joint distribution into partial moment quadrants, and then assigns a quadrant identification at each partition.

NNS.part returns a data.table of observations along with their final quadrant identification. It also returns the regression points, which are the quadrant means used in NNS.reg.

x = seq(-5, 5, .05); y = x ^ 3

for(i in 1 : 4){NNS.part(x, y, order = i, noise.reduction = "off", Voronoi = TRUE)}
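The returned list can also be inspected directly. A minimal sketch, assuming the component names $dt and $regression.points used by the NNS release:

part = NNS.part(x, y, order = 2)
head(part$dt)           # observations with their final quadrant identifications
part$regression.points  # quadrant means subsequently used by NNS.reg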

X-only Partitioning

NNS.part offers a partitioning based on $x$ values only, using the entire bandwidth in its regression point derivation, and shares the same limit condition as partitioning via both $x$ and $y$ values.

for(i in 1 : 4){NNS.part(x, y, order = i, type = "XONLY", Voronoi = TRUE)}

Clusters Used in Regression

The right column of plots shows the corresponding regression for the order of NNS partitioning.

for(i in 1 : 3){NNS.part(x, y, order = i, Voronoi = TRUE) ; NNS.reg(x, y, order = i)}

NNS Regression: NNS.reg

NNS.reg can fit any $f(x)$, for both uni- and multivariate cases. NNS.reg returns a list of values whose names are self-evident, shown in the output below.

Univariate:

NNS.reg(x, y, order = 4, noise.reduction = "off")
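Individual components of that list can be extracted with standard $ subsetting. A minimal sketch, assuming component names such as $R2 and $Fitted.xy, and a plot argument that suppresses the plot; these names may differ across NNS versions:

uni = NNS.reg(x, y, order = 4, noise.reduction = "off", plot = FALSE)
uni$R2               # goodness of fit
head(uni$Fitted.xy)  # fitted values paired with the original observations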

Multivariate:

Multivariate regressions return a plot of $y$ and $\hat{y}$.

f = function(x, y) x ^ 3 + 3 * y - y ^ 3 - 3 * x
y = x ; z = expand.grid(x, y)
g = f(z[ , 1], z[ , 2])
NNS.reg(z, g, order = "max")

Inter/Extrapolation

NNS.reg can inter- or extrapolate any point of interest. The point.est parameter in NNS.reg(x, y, point.est = ...) accepts data of any size, provided it shares the dimensions of $x$; the resulting estimates are accessed specifically with $Point.est.
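For example, estimating points outside the observed range of $x$; a minimal sketch, where the evaluation points -6 and 6 are arbitrary choices beyond the sample domain:

x = seq(-5, 5, .05); y = x ^ 3  # re-create the cubic data (y was reassigned in the multivariate example)
NNS.reg(x, y, point.est = c(-6, 6))$Point.est  # extrapolated estimates at x = -6 and x = 6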

Classification

For a classification problem, we simply set NNS.reg(x, y, type = "CLASS", ...).

NNS.reg(iris[ , 1 : 4], iris[ , 5], point.est = iris[1 : 10, 1 : 4], type = "CLASS", location = "topleft")$Point.est
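In-sample accuracy can then be gauged by comparing the class estimates with the actual species codes. A sketch, assuming type = "CLASS" returns the numeric factor codes of iris[ , 5]:

pred = NNS.reg(iris[ , 1 : 4], iris[ , 5], point.est = iris[ , 1 : 4], type = "CLASS")$Point.est
mean(pred == as.numeric(iris[ , 5]))  # proportion of correctly recovered class codes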

NNS Dimension Reduction Regression

NNS.reg also provides a dimension reduction regression via the parameter NNS.reg(x, y, dim.red.method = "cor", ...), reducing all regressors to a single dimension using the equation returned with $equation.

NNS.reg(iris[ , 1 : 4], iris[ , 5], dim.red.method = "cor", location = "topleft")$equation

Thus, our model for this regression would be: $$Species = \frac{0.7825612Sepal.Length -0.4266576Sepal.Width + 0.9490347Petal.Length + 0.9565473Petal.Width}{4} $$
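These weights are the Pearson correlations between each regressor and the numerically coded Species, which can be verified directly. A minimal sketch, assuming dim.red.method = "cor" uses plain correlation weights (consistent with the coefficients above):

sapply(iris[ , 1 : 4], function(reg) cor(reg, as.numeric(iris[ , 5])))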

Threshold

NNS.reg(x, y, dim.red.method = "cor", threshold = ...) offers a method of reducing regressors further by controlling the absolute value of required correlation.

NNS.reg(iris[ , 1 : 4], iris[ , 5], dim.red.method = "cor", threshold = .75, location = "topleft")$equation

Thus, our model for this further reduced dimension regression would be: $$Species = \frac{0.7825612Sepal.Length -0Sepal.Width + 0.9490347Petal.Length + 0.9565473Petal.Width}{3} $$
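The thresholding can be reproduced by zeroing out any weight whose absolute correlation falls below .75. A sketch building on the correlation weights computed above:

w = sapply(iris[ , 1 : 4], function(reg) cor(reg, as.numeric(iris[ , 5])))
ifelse(abs(w) < .75, 0, w)  # Sepal.Width is zeroed out, leaving 3 nonzero weights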

The point.est = (...) parameter operates in the same manner as in the full regression above, with the estimates again accessed via $Point.est.

NNS.reg(iris[ , 1 : 4], iris[ , 5], dim.red.method = "cor", threshold = .75, point.est = iris[1 : 10, 1 : 4], location = "topleft")$Point.est

References

If the user is so motivated, detailed arguments and further examples are provided within the following:

Viole, F. and Nawrocki, D. (2013) "Nonlinear Nonparametric Statistics: Using Partial Moments"
