ReLU: Rectified Linear Unit activation function


View source: R/neuralnetwork.R

Description

Rectified Linear Unit activation function: replaces each negative element of the input with zero.

Usage

ReLU(x)

Arguments

x

numeric vector or matrix.

Value

The input with every negative element replaced by zero (the element-wise maximum of x and 0), preserving the input's shape.

Examples

ReLU(c(3, 5, -2, 0))
ReLU(matrix(-2:3, 2))
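
Since the package source is not reproduced on this page, here is a minimal sketch of what ReLU likely computes, assuming the standard definition implied by the Value section (element-wise maximum with zero); the actual implementation in R/neuralnetwork.R may differ:

```r
# Sketch of a standard ReLU: element-wise max(x, 0).
# pmax() recycles the scalar 0 and keeps the attributes of the
# first argument, so matrix inputs keep their dimensions.
ReLU <- function(x) {
  pmax(x, 0)
}

ReLU(c(3, 5, -2, 0))   # negative entries become 0: 3 5 0 0
ReLU(matrix(-2:3, 2))  # a 2x3 matrix with its negatives zeroed
```

Using pmax() rather than ifelse(x > 0, x, 0) keeps the operation vectorized and shape-preserving in a single call.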

wiper8/AI documentation built on Dec. 23, 2021, 5:15 p.m.