activation_relu: Activation functions

View source: R/activations.R

Description

Activation functions can either be used through layer_activation(), or through the activation argument supported by all forward layers.
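
For example, a minimal sketch of both styles (assuming the keras package is attached; the layer sizes here are arbitrary):

library(keras)

model <- keras_model_sequential() %>%
  layer_dense(units = 32, input_shape = c(784)) %>%
  layer_activation("relu") %>%                      # standalone activation layer
  layer_dense(units = 10, activation = "softmax")   # activation argument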

Usage

activation_relu(x, alpha = 0, max_value = NULL, threshold = 0)

activation_elu(x, alpha = 1.0)

activation_selu(x)

activation_hard_sigmoid(x)

activation_linear(x)

activation_sigmoid(x)

activation_softmax(x, axis = -1)

activation_softplus(x)

activation_softsign(x)

activation_tanh(x)

activation_exponential(x)

Arguments

x

Tensor.

alpha

Alpha value. For activation_relu() this is the slope of the negative section; for activation_elu() it scales the negative factor.

max_value

Max value. For activation_relu() this is the saturation threshold, i.e. the largest value the function can return.

threshold

Threshold value for thresholded activation.

axis

Integer, axis along which the softmax normalization is applied.

Value

Tensor with the same shape and dtype as x.
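
For instance, a quick sketch of how alpha and max_value affect activation_relu() (assuming a TensorFlow backend; k_constant() is used only to build a small test tensor):

library(keras)

x <- k_constant(c(-3, -1, 0, 2, 5))

activation_relu(x)                  # standard relu: 0, 0, 0, 2, 5
activation_relu(x, alpha = 0.1)     # leaky below the threshold: -0.3, -0.1, 0, 2, 5
activation_relu(x, max_value = 4)   # clipped from above: 0, 0, 0, 2, 4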

