Batch Normalization: Accelerating Deep Network Training

    • Batch normalization: A method that normalizes each scalar feature independently over the mini-batch, using learnable scale and shift parameters to preserve representational capacity (a minimal sketch of the transform follows this list).
    • Internal covariate shift hypothesis: Identification and formalization of the problem of shifting input distributions during training as a contributor to optimization difficulty.
    • Training acceleration: Demonstration that batch normalization reaches the same accuracy on ImageNet with 14x fewer training steps and tolerates much higher learning rates.
    • Inception-BN architecture: A batch-normalized version of GoogLeNet (Inception) that exceeded the original's accuracy on ImageNet and, in ensemble form, exceeded the estimated accuracy of human raters.
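
To make the first bullet concrete, below is a minimal NumPy sketch of the batch-normalizing transform applied during training. The function name, the epsilon value, and the toy data are illustrative assumptions, and the inference-time step (replacing mini-batch statistics with running averages) is omitted for brevity.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Batch-normalizing transform for one mini-batch (training mode).

    x:     (N, D) mini-batch of N examples with D scalar features
    gamma: (D,) learnable scale parameter
    beta:  (D,) learnable shift parameter
    eps:   small constant for numerical stability (value assumed here)
    """
    mu = x.mean(axis=0)                     # per-feature mini-batch mean
    var = x.var(axis=0)                     # per-feature mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)   # normalize each feature independently
    return gamma * x_hat + beta             # scale and shift preserve representational capacity

# Example: a mini-batch of 4 examples with 3 features
rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=(4, 3))
gamma, beta = np.ones(3), np.zeros(3)       # identity-like initialization
y = batch_norm_forward(x, gamma, beta)
print(y.mean(axis=0))  # approximately 0 per feature
print(y.std(axis=0))   # approximately 1 per feature
```

With gamma = 1 and beta = 0 the output is simply the normalized activations; because gamma and beta are learned, the network can recover the identity transform if that is optimal, which is why the transform does not reduce what the layer can represent.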