Batch Normalization Accelerating Deep Network Training

{{Term|batch normalization|Batch normalization}} became one of the most ubiquitous components in {{Term|deep learning}} architectures. It was adopted almost universally in convolutional networks throughout the late 2010s and remains standard in many architectures. The technique's success inspired a family of normalization methods, including {{Term|layer normalization}} (preferred in {{Term|transformer|Transformers}} and recurrent networks), instance normalization (used in style transfer), and group normalization (useful for small batch sizes).
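These variants differ chiefly in which axes of the activation tensor the mean and variance are computed over. The following minimal NumPy sketch illustrates the four reductions for a tensor of shape (batch, channels, height, width); the shapes, the group count, and the <code>normalize</code> helper are illustrative, and the learnable scale and shift parameters of the real layers are omitted.

<syntaxhighlight lang="python">
import numpy as np

def normalize(x, axes, eps=1e-5):
    """Subtract the mean and divide by the std over the given axes."""
    mean = x.mean(axis=axes, keepdims=True)
    var = x.var(axis=axes, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

# Toy activations: batch N=8, channels C=4, spatial H=W=6.
x = np.random.randn(8, 4, 6, 6)

# Batch norm: statistics over the batch and spatial dims, per channel.
bn = normalize(x, axes=(0, 2, 3))

# Layer norm: statistics over all features of each sample.
ln = normalize(x, axes=(1, 2, 3))

# Instance norm: statistics over spatial dims, per sample and channel.
inorm = normalize(x, axes=(2, 3))

# Group norm: split C=4 channels into G=2 groups, then normalize per
# sample and group.
N, C, H, W = x.shape
G = 2
gn = normalize(x.reshape(N, G, C // G, H, W), axes=(2, 3, 4)).reshape(N, C, H, W)
</syntaxhighlight>

Of the four, only batch normalization computes statistics across samples in the batch, which is why the batch-independent variants are preferred when batches are small or unavailable.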
