dReLU: Derivative of the Rectified Linear Unit (ReLU) Activation Function

View source: R/relu.R

dReLU    R Documentation

Derivative of the Rectified Linear Unit (ReLU) Activation Function

Description

This function applies the derivative of the Rectified Linear Unit (ReLU) activation function to the input numeric vector.
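For reference, the derivative of ReLU is 1 for positive inputs and 0 otherwise. A minimal base-R sketch of that rule, shown only as an illustration (it is not the package's implementation; treating the derivative at exactly 0 as 0 is an assumption of this sketch):

# Hypothetical helper, not part of rxode2
drelu_sketch <- function(x) ifelse(x > 0, 1, 0)
drelu_sketch(c(-1, 0, 1, 2))  # 0 0 1 1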

Usage

dReLU(x)

Arguments

x

A numeric vector. All elements must be finite and non-missing.

Value

A numeric vector where the derivative of the ReLU activation function has been applied to each element of x.

See Also

Other Activation Functions: ELU(), GELU(), PReLU(), ReLU(), SELU(), Swish(), dELU(), dGELU(), dPReLU(), dSELU(), dSwish(), dlReLU(), dsoftplus(), lReLU(), softplus()

Examples


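# Apply the ReLU derivative element-wise to a numeric vector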
dReLU(c(-1, 0, 1, 2))

# Can also be used in rxode2:
x <- rxode2({
   r=dReLU(time)
})

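# Event table supplying the times at which the model is evaluated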
e <- et(c(-1, 0, 1, 2))

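# Solve the model; the r column of the result holds dReLU evaluated at each time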
rxSolve(x, e)
