Translations:Batch Normalization Accelerating Deep Network Training/23/en

    From Marovi AI

Batch normalization became one of the most ubiquitous components in deep learning architectures. It was adopted almost universally in convolutional networks throughout the late 2010s and remains standard in many architectures today. Its success inspired a family of related normalization methods, including layer normalization (preferred in Transformers and recurrent networks), instance normalization (used in style transfer), and group normalization (useful when batch sizes are small).
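The methods in this family differ mainly in which axes of the activation tensor the mean and variance are computed over. A minimal NumPy sketch may make the distinction concrete; the `normalize` helper, the NCHW layout, and the group count are illustrative assumptions, not details from the original text:

```python
import numpy as np

def normalize(x, axes, eps=1e-5):
    # Standardize x to zero mean and unit variance over the given axes.
    # (Learned scale/shift parameters are omitted for brevity.)
    mean = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

# Hypothetical activations in NCHW layout: (batch, channels, height, width).
x = np.random.randn(8, 16, 4, 4)

batch_norm    = normalize(x, axes=(0, 2, 3))  # per channel, across the batch
layer_norm    = normalize(x, axes=(1, 2, 3))  # per sample, across all features
instance_norm = normalize(x, axes=(2, 3))     # per sample and channel

# Group norm: split channels into groups, normalize within each group.
groups = 4
xg = x.reshape(8, groups, 16 // groups, 4, 4)
group_norm = normalize(xg, axes=(2, 3, 4)).reshape(x.shape)
```

Batch normalization's dependence on the batch axis (axis 0) is why its statistics degrade at small batch sizes, and why the per-sample variants above avoid that axis entirely.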