ReLU: Rectified Linear Unit (ReLU) Activation Function

View source: R/relu.R

ReLU R Documentation

Rectified Linear Unit (ReLU) Activation Function

Description

This function applies the Rectified Linear Unit (ReLU) activation function element-wise to the input numeric vector. The ReLU function is defined as the positive part of its argument: f(x) = max(0, x).
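The element-wise computation can be sketched in one line of base R using the vectorized `pmax()`; this is only an illustrative equivalent (the name `relu_sketch` is hypothetical, and the package's actual implementation in R/relu.R may use compiled code):

```r
## Hypothetical sketch of ReLU(): element-wise max(0, x) via pmax().
## The rxode2 implementation may differ internally.
relu_sketch <- function(x) pmax(0, x)

relu_sketch(c(-1, 0, 1, 2))  # 0 0 1 2
```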

Usage

ReLU(x)

Arguments

x

A numeric vector. All elements must be finite and non-missing.

Value

A numeric vector where the ReLU function has been applied to each element of x.

Author(s)

Matthew Fidler

See Also

Other Activation Functions: ELU(), GELU(), PReLU(), SELU(), Swish(), dELU(), dGELU(), dPReLU(), dReLU(), dSELU(), dSwish(), dlReLU(), dsoftplus(), lReLU(), softplus()

Examples


ReLU(c(-1, 0, 1, 2))  # returns 0 0 1 2

# ReLU() can also be used inside an rxode2 model:
x <- rxode2({
  r = ReLU(time)
})

# Event table with observation times matching the vector above
e <- et(c(-1, 0, 1, 2))

rxSolve(x, e)


nlmixr2/rxode2 documentation built on Jan. 11, 2025, 8:48 a.m.