batch_normalization: Batch Normalization Function that normalizes the input before...

Description Usage Arguments References See Also

Description

This function normalizes the distribution of inputs to the hidden layers of a neural network.

Usage

batch_normalization(x, gamma, beta, mu = NULL, sigma_2 = NULL,
  epsilon = exp(-12))

Arguments

x

weighted sum of outputs from the previous layer

gamma

the gamma coefficient

beta

the beta coefficient

mu

the mean of the input neurons. If NULL, it will be calculated in the function.

sigma_2

the variance of the input neurons. If NULL, it will be calculated in the function.

epsilon

a constant added to the variance for numerical stability
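The transform the arguments describe can be sketched in plain R. This is a minimal illustration of the batch normalization formula from the referenced paper, not the package's actual implementation; the function name `batch_norm_sketch` is hypothetical, and `x` is assumed to be a numeric vector of pre-activations for a single unit across a mini-batch.

```r
# Hypothetical sketch of the batch normalization transform:
# normalize x to zero mean and unit variance, then scale by gamma
# and shift by beta. Defaults mirror the documented signature.
batch_norm_sketch <- function(x, gamma, beta, mu = NULL, sigma_2 = NULL,
                              epsilon = exp(-12)) {
  if (is.null(mu))      mu      <- mean(x)             # batch mean
  if (is.null(sigma_2)) sigma_2 <- mean((x - mu)^2)    # batch variance
  x_hat <- (x - mu) / sqrt(sigma_2 + epsilon)          # normalize
  gamma * x_hat + beta                                 # scale and shift
}

x <- c(1, 2, 3, 4)
y <- batch_norm_sketch(x, gamma = 1, beta = 0)
```

With `gamma = 1` and `beta = 0`, the output has mean approximately 0 and variance approximately 1; `epsilon` only guards against division by zero when the batch variance is tiny.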

References

Sergey Ioffe and Christian Szegedy. "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift." ICML 2015.

See Also

http://jmlr.org/proceedings/papers/v37/ioffe15.pdf (p. 4)


