FunctionalModel.mlp: Create a Multi-Layer Perceptron Model with the Given Layers...


View source: R/mlp.R


Description

With this function, we can create a multi-layer perceptron (MLP) model with the specified layer configuration and activation function. Each neuron has a bias parameter plus n weights, one for each incoming connection from the previous layer. The output of the last layer is also weighted and added to a final bias parameter.
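To make the parameter layout concrete, here is a small illustrative sketch in Python (the package itself is R) of a forward pass and a parameter count consistent with the description above: one bias per neuron, one weight per incoming connection, and a final weighted-and-biased output. The exact parameter ordering used by the package may differ; function names here are hypothetical.

```python
import math

def mlp_param_count(layers):
    """Count parameters for the structure described above: each neuron has
    one bias plus one weight per incoming connection, and the last layer's
    output is weighted and offset by a final bias. Illustrative only."""
    count = 0
    prev = 1  # a single scalar model input is assumed
    for n in layers:
        count += n * (prev + 1)  # prev weights + 1 bias per neuron
        prev = n
    count += prev + 1            # output weights for last layer + final bias
    return count

def mlp_forward(layers, params, x, func=math.tanh):
    """Evaluate the perceptron at scalar input x using a flat parameter
    vector, consuming parameters in the same order as mlp_param_count."""
    i = 0
    values = [x]
    for n in layers:
        new = []
        for _ in range(n):
            s = params[i]; i += 1           # bias of this neuron
            for v in values:
                s += params[i] * v; i += 1  # weight times incoming value
            new.append(func(s))
        values = new
    out = params[i]; i += 1                 # final bias
    for v in values:
        out += params[i] * v; i += 1        # output weights
    return out

# e.g. layers (1, 2, 1): 1*(1+1) + 2*(1+1) + 1*(2+1) + (1+1) = 11 parameters
```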


Usage

FunctionalModel.mlp(layers, decreasing = FALSE, func = "tanh",
  func.mode = 1L)



Arguments

layers: the vector of layer sizes, e.g., c(1) for a perceptron with a single node serving as both input and output, or c(1, 2, 1) for an input node, two hidden nodes, and an output node


decreasing: if TRUE, an MLP is generated whose parameter ranges are limited such that it can only represent monotonically decreasing functions. If FALSE, a general MLP is produced which can represent arbitrary function shapes (within its degrees of freedom).
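The documentation does not say how the parameter limiting works internally. One standard construction for such a guarantee, sketched below in Python purely for illustration, constrains weight signs: with an increasing activation like tanh, making all first-layer weights non-positive and all later weights non-negative forces the whole network to be non-increasing in its input. The package's actual mechanism may be different.

```python
import math

def decreasing_net(x, w1, b1, w2, b2):
    """A tiny MLP with one input, one hidden layer, and one output that is
    guaranteed non-increasing in x whenever every entry of w1 is <= 0 and
    every entry of w2 is >= 0, because tanh is increasing. This is one
    common sign-constraint construction, not necessarily the package's."""
    hidden = [math.tanh(w * x + b) for w, b in zip(w1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

# sample the function on a grid to observe the monotone decrease
xs = [i / 10 for i in range(-20, 21)]
ys = [decreasing_net(x, w1=[-1.0, -0.5], b1=[0.2, -0.3],
                     w2=[0.7, 1.2], b2=0.1) for x in xs]
assert all(a >= b for a, b in zip(ys, ys[1:]))  # non-increasing everywhere sampled
```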


func: the activation function, "tanh" by default


func.mode: only relevant if decreasing == TRUE: to generate a monotonically decreasing perceptron, we need to know whether the activation function func is itself monotonically decreasing (func.mode == -1L) or increasing (func.mode == 1L). The default tanh function is increasing.


Value

a functional model representing the given perceptron

thomasWeise/regressoR.functional.models documentation built on July 23, 2018, 4:33 p.m.