LReLU: Leaky Rectified Linear Unit activation derivative function


View source: R/neuralnetwork.R

Description

Leaky Rectified Linear Unit activation derivative function

Usage

LReLU(x, a = 0.1)

Arguments

x

numeric vector or matrix.

a

leak coefficient: the slope applied to negative inputs. Defaults to 0.1.

Value

The Leaky ReLU derivative of the arguments, evaluated element-wise.
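
For reference, here is a minimal sketch of the standard Leaky ReLU derivative that the description and signature suggest. This is an assumption for illustration only; LReLU_sketch is a hypothetical name, not the implementation in R/neuralnetwork.R.

# Hypothetical sketch (assumed behaviour, not the package source):
# the Leaky ReLU activation is max(x, a * x), so its derivative is
# 1 where x > 0 and a elsewhere.
LReLU_sketch <- function(x, a = 0.1) {
  ifelse(x > 0, 1, a)
}

LReLU_sketch(c(3, 5, -2, 0))   # 1.0 1.0 0.1 0.1
LReLU_sketch(matrix(-2:3, 2))  # result keeps the matrix shape of the input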

Examples

LReLU(c(3, 5, -2, 0))
LReLU(matrix(-2:3, 2))
