This function normalizes the distribution of inputs to hidden layers in a neural network (batch normalization).

`x` | weighted sum of outputs from the previous layer |

`gamma` | the learned scale coefficient applied after normalization |

`beta` | the learned shift coefficient applied after normalization |

`mu` | the mean of the input neurons. If NULL, it will be calculated in the function. |

`sigma_2` | the variance of the input neurons. If NULL, it will be calculated in the function. |

`epsilon` | a small constant added to the variance for numerical stability |
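The transform these arguments describe can be sketched as follows (an illustrative NumPy version, not the package's own implementation; the function name `batch_norm` is hypothetical, and parameter names mirror the arguments above):

```python
import numpy as np

def batch_norm(x, gamma, beta, mu=None, sigma_2=None, epsilon=1e-8):
    """Normalize x, then scale by gamma and shift by beta (a sketch)."""
    # If the mean/variance are not supplied, compute them from the mini-batch,
    # as the documentation says happens when mu or sigma_2 is NULL.
    if mu is None:
        mu = x.mean(axis=0)
    if sigma_2 is None:
        sigma_2 = x.var(axis=0)
    # Normalize to roughly zero mean / unit variance, then apply the
    # learned scale (gamma) and shift (beta).
    x_hat = (x - mu) / np.sqrt(sigma_2 + epsilon)
    return gamma * x_hat + beta
```

With `gamma = 1` and `beta = 0` the output of each column has approximately zero mean and unit variance.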

Sergey Ioffe and Christian Szegedy. Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift. http://jmlr.org/proceedings/papers/v37/ioffe15.pdf, p. 4.
