dnnet: Multilayer Perceptron Model for Regression or Classification


View source: R/2-2-dnnet.R

Description

Fit a Multilayer Perceptron Model for Regression or Classification

Usage

dnnet(train, validate = NULL, norm.x = TRUE,
  norm.y = ifelse(is.factor(train@y), FALSE, TRUE),
  activate = "elu", n.hidden = c(10, 10),
  learning.rate = ifelse(learning.rate.adaptive %in% c("adam"), 0.001, 0.01),
  l1.reg = 0, l2.reg = 0, n.batch = 100, n.epoch = 100,
  early.stop = ifelse(is.null(validate), FALSE, TRUE),
  early.stop.det = 5, plot = FALSE,
  accel = c("rcpp", "gpu", "none")[3],
  learning.rate.adaptive = c("constant", "adadelta", "adagrad", "momentum", "adam")[2],
  rho = c(0.9, 0.95, 0.99, 0.999)[ifelse(learning.rate.adaptive == "momentum", 1, 3)],
  epsilon = c(10^-10, 10^-8, 10^-6, 10^-4)[2],
  beta1 = 0.9, beta2 = 0.999,
  loss.f = ifelse(is.factor(train@y), "logit", "mse"), ...)

Arguments

train

A dnnetInput object, the training set.

validate

A dnnetInput object, the validation set, optional.

norm.x

A boolean variable indicating whether to normalize the input matrix.

norm.y

A boolean variable indicating whether to normalize the response (if continuous).

activate

Activation function, one of the following: "sigmoid", "tanh", "relu", "prelu", "elu", "celu" (standard definitions of two of these are sketched after this list).

n.hidden

A numeric vector giving the number of hidden units in each hidden layer; the default c(10, 10) specifies two hidden layers of 10 units each.

learning.rate

Initial learning rate, 0.01 by default; if "adam" is chosen as the adaptive learning-rate adjustment method, 0.001 by default.

l1.reg

Weight for L1 regularization, optional.

l2.reg

Weight for L2 regularization, optional.

n.batch

Batch size for batch gradient descent.

n.epoch

Maximum number of epochs.

early.stop

Whether to use early stopping (applicable only when a validation set is provided).

early.stop.det

Number of consecutive epochs of increasing validation loss required to trigger early stopping.

plot

Whether to plot the loss curve during training.

accel

"rcpp" to use the Rcpp version of back-propagation, "none" (the default) to use the R version.

learning.rate.adaptive

Adaptive learning-rate adjustment method, one of the following: "constant", "adadelta", "adagrad", "momentum", "adam".

rho

A parameter used in Adadelta and Momentum; per the Usage defaults, 0.9 when learning.rate.adaptive is "momentum" and 0.99 otherwise.

epsilon

A small constant used in Adagrad and Adam for numerical stability.

beta1

A parameter used in Adam: the exponential decay rate for the first-moment estimate (see the Adam sketch after this list).

beta2

A parameter used in Adam: the exponential decay rate for the second-moment estimate.

loss.f

Loss function of choice; defaults to "logit" when train@y is a factor (classification) and "mse" otherwise (regression).
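
For reference, standard textbook definitions of two of the listed activations, as a sketch only; the package's actF implementations may use a different parameterization (e.g. the ELU alpha assumed below).

  ## Textbook definitions; see actF in this package for the actual code.
  relu <- function(x) pmax(x, 0)
  elu  <- function(x, alpha = 1) ifelse(x > 0, x, alpha * (exp(x) - 1))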
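
To make the roles of learning.rate, beta1, beta2, and epsilon concrete, here is a minimal sketch of the standard Adam update rule; it illustrates how the parameters enter the step and is not a transcription of dnnet's internals.

  ## Standard Adam update for a weight vector w; dnnet's implementation
  ## may differ in detail.
  adam.step <- function(w, grad, state, learning.rate = 0.001,
                        beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8) {
    state$t <- state$t + 1
    state$m <- beta1 * state$m + (1 - beta1) * grad    # first moment
    state$v <- beta2 * state$v + (1 - beta2) * grad^2  # second moment
    m.hat <- state$m / (1 - beta1^state$t)             # bias correction
    v.hat <- state$v / (1 - beta2^state$t)
    w <- w - learning.rate * m.hat / (sqrt(v.hat) + epsilon)
    list(w = w, state = state)
  }

  ## One step on a toy gradient:
  st <- list(m = 0, v = 0, t = 0)
  adam.step(w = c(0.5, -0.2), grad = c(0.1, -0.3), state = st)$w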

Value

Returns a DnnModelObj object.

See Also

dnnet-class
dnnetInput-class
actF

