plot.som: Plot a som object


Description

Plot a som object. Several kinds of plots can be drawn; see Details below.

Usage

plot.som(object, type = "effectif", cex = 0.75, nom.variable, precision = 4, cex.label = 0.75)

Arguments

object

a som object

type

character specifying the type of graph. Possible values are: distance | mean.ss | effectif | meta | variable | energy | dxdy (see Details).

cex

character expansion of titles and legend.

nom.variable

Name of the variable whose weights are to be plotted. It has no effect unless type is 'variable'.

precision

Precision of the displayed label if it is numeric.

cex.label

character expansion of the displayed info in each polygon of the network.

Details

distance: euclidean distances between a class and all the other classes. Change the reference class with set.current.case.

mean.ss: mean sums of squares for each prototype.

effectif: number of rows classified in each prototype.

meta: meta-groups (see plot.clust for details).

variable: weight values for a variable in each prototype. This allows one to see whether a variable is used by the network or not (if not, its values are nearly equal across prototypes).

energy: plot of the energy function during the learning process. This represents the intra-class inertia extended to the neighborhood at a given stage of learning; it is the objective function minimised during learning.

dxdy: the quality of the projection can be evaluated by the 'dy-dx' representation. It is a plot of all the possible distances in the input space (dx) versus their respective distances in the output space (dy). For a linear projection the 'dy-dx' plot should be linear.
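As a minimal sketch of the variable screen described above, one might draw the 'variable' plot for every input column in turn and look for nearly flat weight maps. This assumes a fitted object such as lcrabs.som from the Examples below; the column names of the training data are used as variable names.

```r
## Sketch only: assumes lcrabs.som and lcrabs exist as in the Examples below.
## Variables whose weight maps are nearly flat (values nearly equal across
## prototypes) are likely not used by the network.
for (v in colnames(lcrabs)) {
    plot(lcrabs.som, "variable", nom.variable = v)
}
```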

Author(s)

David Gohel

References

Demartines, P. and Herault, J. Curvilinear component analysis: A self-organizing neural network for nonlinear mapping of data sets.

Kohonen, T. (1995). Self-Organizing Maps.

See Also

som, learn, biplot.som

Examples

library(MASS)
lcrabs <- log(crabs[, 4:8])

lcrabs.som <- som(formula = ~ ., data = lcrabs
	, neighborhood = "uniform"
	, grid = grid(xdim = 10, ydim = 10, type = "hexagonal")
	, weights.min = min(lcrabs), weights.max = max(lcrabs)
	)
lcrabs.som <- learn(lcrabs.som, number.iter = 500, max.alpha = 0.5
	, min.alpha = 0.001, max.rayon = 3, step.eval.si = 50)



#--- plot the energy function during learning
plot(lcrabs.som, "energy")
#--- quality of learning: points must be highly correlated, lying on a line from c(0, 0) to c(1, 1)
plot(lcrabs.som, "dxdy")
#--- plot number of cases per neuron/class
plot(lcrabs.som, "effectif", cex.label = 0)
#--- plot mean sums of squares
plot(lcrabs.som, "mean.ss", cex.label = 0.3)
#--- change current case
lcrabs.som <- set.current.case(lcrabs.som, 4)
plot(lcrabs.som, "distance", cex = 0.75, cex.label = 0)



#--- plot values for a variable... useful for selecting active variables
plot(lcrabs.som, "variable")
plot(lcrabs.som, "variable", nom.variable = "FL")

## Not run: 
#--- construct meta-classes: click on the graph; the y value clicked
#--- will be used to collapse classes into meta-classes ---#
lcrabs.som <- plot.clust(lcrabs.som, interactive = TRUE)

## End(Not run)

#--- plot meta-classes ---#
plot(lcrabs.som, "meta", cex.label = 0.5)

harrysouthworth/kohonen documentation built on May 17, 2019, 3:03 p.m.