batch_normalization: Batch Normalization Function that normalizes the input before...

Description Usage Arguments References See Also

View source: R/batch_normalization.R

Description

This function normalizes the distribution of inputs to the hidden layers of a neural network.

Usage

batch_normalization(x, gamma, beta, mu = NULL, sigma_2 = NULL,
  epsilon = exp(-12))

Arguments

x

weighted sum of outputs from the previous layer

gamma

the learnable scale coefficient (gamma)

beta

the learnable shift coefficient (beta)

mu

the mean of the input neurons. If NULL, it will be calculated in the function.

sigma_2

the variance of the input neurons. If NULL, it will be calculated in the function.

epsilon

a constant added to the variance for numerical stability
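The transform described by these arguments can be sketched in plain R. This is a minimal illustration of the batch-normalization formula from Ioffe and Szegedy (2015), not necessarily the package's exact implementation; it assumes `x` is a numeric vector and that `mu` and `sigma_2`, when omitted, are estimated from the mini-batch itself:

```r
# Sketch of batch normalization: normalize x to zero mean and unit
# variance, then apply the learnable scale (gamma) and shift (beta).
batch_norm_sketch <- function(x, gamma, beta, mu = NULL, sigma_2 = NULL,
                              epsilon = exp(-12)) {
  # When batch statistics are not supplied, compute them from x
  if (is.null(mu)) mu <- mean(x)
  if (is.null(sigma_2)) sigma_2 <- mean((x - mu)^2)
  # Normalize; epsilon guards against division by zero
  x_hat <- (x - mu) / sqrt(sigma_2 + epsilon)
  # Scale and shift
  gamma * x_hat + beta
}

out <- batch_norm_sketch(c(1, 2, 3, 4), gamma = 1, beta = 0)
```

With `gamma = 1` and `beta = 0` the output is simply the standardized input, with mean approximately 0.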

References

Ioffe, Sergey and Christian Szegedy. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. ICML 2015.

See Also

http://jmlr.org/proceedings/papers/v37/ioffe15.pdf, p. 4


rz1988/deeplearning documentation built on May 28, 2019, 10:46 a.m.