Translations:Batch Normalization Accelerating Deep Network Training/17/en
The authors also observed that batch normalization reduces the network's dependence on precise weight initialization, permits higher learning rates without divergence, and provides a mild regularization effect: because each sample's normalized value depends on the other samples in its mini-batch, the normalization introduces stochastic noise during training.
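A minimal NumPy sketch can illustrate the source of that noise (this is an illustration, not the paper's implementation; the learnable scale-and-shift parameters gamma and beta are omitted). Normalizing the same sample inside two different mini-batches yields different outputs, because the batch mean and variance change with the sample's companions:

```python
import numpy as np

def batch_norm(x, eps=1e-5):
    # Normalize each feature with the mini-batch mean and variance.
    # (The learnable scale/shift step of full batch norm is omitted.)
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return (x - mu) / np.sqrt(var + eps)

rng = np.random.default_rng(0)
sample = rng.normal(size=(1, 4))                 # one fixed sample
batch_a = np.vstack([sample, rng.normal(size=(3, 4))])
batch_b = np.vstack([sample, rng.normal(size=(3, 4))])

# The same sample's normalized value differs across mini-batches;
# this batch-dependent variation is the mild regularizing noise.
out_a = batch_norm(batch_a)[0]
out_b = batch_norm(batch_b)[0]
print(np.allclose(out_a, out_b))
```

Since the two random mini-batches have different statistics, the printed result is `False`, showing that the sample's normalized representation is not a fixed function of the sample alone.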