initnw | R Documentation
Description

Function to initialize the weights and biases in a neural network. It uses the Nguyen and Widrow (1990) algorithm.
Usage

initnw(neurons, p, n, npar)
Arguments

neurons: Number of neurons.

p: Number of predictors.

n: Number of cases.

npar: Number of parameters to be estimated (weights and biases only); it should equal neurons*(1+1+p)+1.
Details

The algorithm is described in Nguyen and Widrow (1990) and in textbooks such as Sivanandam and Sumathi (2005). It is briefly summarized below.
1. Compute the scaling factor θ = 0.7 p^{1/n}.
2. Initialize the weights and biases of each neuron at random, for example by drawing random numbers from U(-0.5, 0.5).
3. For each neuron k:
   - compute η_k = √(∑_{j=1}^p (β_j^{(k)})²),
   - update the weights (β_1^{(k)}, ..., β_p^{(k)})' as β_j^{(k)} = θ β_j^{(k)} / η_k, j = 1, ..., p,
   - update the bias b_k by drawing a random number from U(-θ, θ).
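As a rough illustration, the steps above can be sketched in R as follows. This is an illustrative re-implementation under stated assumptions, not the brnn internals; the function name `initnw_sketch` and all variable names are invented for the example.

```r
# Sketch of Nguyen-Widrow initialization following the steps in the
# Details section. Returns one vector per neuron:
# (omega_k, b_k, beta_1^{(k)}, ..., beta_p^{(k)})'.
initnw_sketch <- function(neurons, p, n) {
  # Step 1: scaling factor, as given above
  theta <- 0.7 * p^(1 / n)

  out <- vector("list", neurons)
  for (k in 1:neurons) {
    # Step 2: random initial weights and biases from U(-0.5, 0.5)
    omega <- runif(1, -0.5, 0.5)   # weight connecting neuron k to the output
    beta  <- runif(p, -0.5, 0.5)   # input-layer weights of neuron k

    # Step 3: rescale the input-layer weights so their norm equals theta ...
    eta  <- sqrt(sum(beta^2))
    beta <- theta * beta / eta

    # ... and redraw the bias from U(-theta, theta)
    b <- runif(1, -theta, theta)

    out[[k]] <- c(omega, b, beta)
  }
  out
}
```

After rescaling, each neuron's input-weight vector has Euclidean norm θ, which is the point of the Nguyen-Widrow scheme: the active regions of the neurons are spread roughly evenly over the input space.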
Value

A list containing initial values for the weights and biases, with one component per neuron. The k-th component is a vector with the initial weights and biases of the k-th neuron, i.e. (ω_k, b_k, β_1^{(k)}, ..., β_p^{(k)})'.
References

Nguyen, D. and Widrow, B. 1990. "Improving the learning speed of 2-layer neural networks by choosing initial values of the adaptive weights", Proceedings of the IJCNN, 3, 21-26.

Sivanandam, S.N. and Sumathi, S. 2005. Introduction to Neural Networks Using MATLAB 6.0. McGraw Hill, first edition.
Examples

## Not run:
# Load the library
library(brnn)

# Set parameters
neurons=3
p=4
n=10
npar=neurons*(1+1+p)+1

initnw(neurons=neurons,p=p,n=n,npar=npar)

## End(Not run)