build_MLP: Build SLP/MLP architecture

View source: R/deepMLP.r

build_MLP R Documentation

Build SLP/MLP architecture

Description

build_MLP creates a sequential feedforward model (SLP or MLP) with stacked dense layers and optional dropout layers.

Usage

build_MLP(
  features,
  hidden = NULL,
  dropout = NULL,
  output = list(1, "linear"),
  loss = "mean_squared_error",
  optimizer = "adam",
  metrics = c("mean_absolute_error")
)

Arguments

features

Number of features, e.g. returned by nunits.

hidden

A data frame with two columns: the first contains the number of hidden units and the second the activation function. The number of rows determines the number of hidden layers.

dropout

A numeric vector of dropout rates (the fractions of input units to drop), or NULL if no dropout is desired.

output

A list with two elements: the first determines the number of output units, e.g. returned by nunits, and the second the output activation function.

loss

Name of an objective function, or an objective function itself. If the model has multiple outputs, a different loss for each output can be specified by passing a dictionary or a list of objectives. The loss value minimized by the model will then be the sum of all individual losses.

optimizer

Name of an optimizer, or an optimizer instance.

metrics

A vector or list of metrics to be evaluated by the model during training and testing.

Value

A compiled model object with stacked dense layers and optional dropout layers.
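A minimal usage sketch, assuming the deepANN and keras packages are installed with a working Keras backend; the feature count, layer sizes, and dropout rates below are illustrative, not prescribed by the package.

```r
library(deepANN)

# Build an MLP for a regression task with 10 input features:
# two hidden layers (64 and 32 units, ReLU activation),
# 20% dropout after each hidden layer, and one linear output unit.
model <- build_MLP(
  features = 10,
  hidden = data.frame(units = c(64, 32),
                      activation = c("relu", "relu")),
  dropout = c(0.2, 0.2),
  output = list(1, "linear"),
  loss = "mean_squared_error",
  optimizer = "adam",
  metrics = c("mean_absolute_error")
)

# Inspect the resulting layer stack
summary(model)
```

The returned model is already compiled with the given loss, optimizer, and metrics, so it can be passed directly to a fitting routine such as fit_MLP.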

See Also

nunits, keras_model_sequential, layer_dense, layer_dropout, compile.keras.engine.training.Model.

Other Single & Multi Layer Perceptron (SLP, MLP): as_MLP_X(), as_MLP_Y(), as_tensor_1d(), as_tensor_2d(), as_tensor_3d(), fit_MLP(), load_weights_ANN(), nsamples(), nsubsequences(), ntimesteps(), nunits(), predict_ANN(), save_weights_ANN()


stschn/deepANN documentation built on June 25, 2024, 7:27 a.m.