dropout: Dropout layer

View source: R/layers.R

Dropout layer

Description

Randomly sets a given fraction (rate) of the input units to 0 at each update during training, which helps prevent overfitting.

Usage

dropout(rate = 0.5)

Arguments

rate

The fraction of input units to drop (set to 0) at each update during training.

Value

A construct of class "ruta_network"

See Also

Other neural layers: conv(), dense(), input(), layer_keras(), output(), variational_block()
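
Examples

A minimal sketch, not taken from the package's own documentation: it assumes ruta's "+" operator for composing the layer constructors listed under See Also, and that dense() takes the number of hidden units as its first argument.

library(ruta)

# Insert a dropout layer between the input and the encoding layer,
# dropping 20% of the input units at each training update (assumed usage)
net <- input() + dropout(rate = 0.2) + dense(36) + output()

class(net)  # expected to include "ruta_network"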

