Elm.search.hc: Search Number of Hidden Neurons Using Hill Climbing

Description Usage Arguments Value References Examples

Description

Finds the number of hidden neurons of an extreme learning machine using a hill climbing procedure.
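The search is a standard hill climb over the candidate number of hidden neurons: start from a small count and move to a neighbouring count while the validation error keeps improving. A minimal sketch of that idea (the names hill_climb and error_for are illustrative, not the package's internals):

```r
# Illustrative hill climb over the hidden-neuron count (not the package's
# exact implementation). error_for(n) should return a validation error for
# a network with n hidden neurons.
hill_climb <- function(error_for, start = 1, max_hn = 50) {
  best <- start
  best_err <- error_for(best)
  repeat {
    cand <- c(best - 1, best + 1)            # neighbouring counts
    cand <- cand[cand >= 1 & cand <= max_hn]
    errs <- sapply(cand, error_for)
    if (min(errs) >= best_err) break         # no neighbour improves: stop
    best <- cand[which.min(errs)]
    best_err <- min(errs)
  }
  best
}
```

In the package itself, the error for each candidate is estimated by cross-validation (n.blocks folds) over an ensemble of n.ensem randomly initialized networks.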

Usage

Elm.search.hc(X.fit, Y.fit, n.ensem = 10, n.blocks = 5, ErrorFunc = RMSE, PercentValid = 20,
              maxHiddenNodes = NULL, Trace = TRUE, autorangeweight = FALSE, rangeweight = 1,
              activation = 'TANH', outputBias = FALSE, rangebias = 1)

Arguments

X.fit

Data matrix (numeric) containing the input values (predictors) used to train the model.

Y.fit

Response vector (numeric) used to train the model.

n.ensem

Number of ensemble members. Default is 10.

n.blocks

An integer specifying the desired number of cross-validation folds. Default is 5.

ErrorFunc

Error function to be minimized. The default is RMSE (as in the usage above), but a customized error function can be supplied.

PercentValid

Percentage of the data reserved for validation (used if n.blocks < 2). Default is 20%.

maxHiddenNodes

Maximum number of hidden nodes. Default is NULL, which means the number of cases minus 1.

Trace

If TRUE, information is printed while the search runs. Default is TRUE.

autorangeweight

Option of whether to use an automated range for the initial random weights. Default is FALSE.

rangeweight

Initial random weights are drawn on [-rangeweight, rangeweight]. The default is 1.

activation

Activation function of the hidden-layer neurons. Available functions are 'TANH' (default) and 'SIG'.
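'TANH' is the hyperbolic tangent and 'SIG' the logistic sigmoid; written out explicitly (the function names here are illustrative, not exported by the package):

```r
tanh_act <- function(x) tanh(x)            # 'TANH': maps inputs to (-1, 1)
sig_act  <- function(x) 1 / (1 + exp(-x))  # 'SIG': logistic sigmoid, maps to (0, 1)
```

The (-1, 1) range of 'TANH' is the reason the example below rescales the inputs and outputs to [-1, 1].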

outputBias

Option of whether to use a bias parameter in the output layer. Default is FALSE.

rangebias

Initial random biases are drawn on [-rangebias, rangebias]. The default is 1.

Value

The best number of hidden neurons found by the automatic procedure.

References

Lima, A.R., A.J. Cannon and W.W. Hsieh. Nonlinear regression in environmental sciences using extreme learning machines. Environmental Modelling and Software (submitted 2014/2/3)

Yuret, D., 1994. From genetic algorithms to efficient optimization. Technical Report 1569. MIT AI Laboratory.

Examples

set.seed(123)
library(MASS)    # provides the wtloss data set
library(scales)  # provides rescale()

#scaling the inputs/outputs
x.train <- rescale(as.matrix(wtloss$Days), to=c(-1,1))
y.train <- rescale(as.matrix(wtloss$Weight), to=c(-1,1))

#Finding the best number of hidden neurons
number.hn <- Elm.search.hc(x.train,y.train)

#training the ELM
trained.elm <- Elm.train(x.train,y.train,Number.hn = number.hn)

#rescaling back the elm outputs
elm.fit.values <- rescale(trained.elm$predictionTrain,
                          to = range(as.matrix(wtloss$Weight)), from = c(-1, 1))

oldpar <- par(mar = c(5.1, 4.1, 4.1, 4.1))
plot(wtloss$Days, wtloss$Weight, type = "p", ylab = "Weight (kg)", main = "Weight Reduction")
lines(wtloss$Days, elm.fit.values, col = 2, type = 'b')
par(oldpar)  # restore graphical parameters

ForAI documentation built on May 2, 2019, 6:14 p.m.