Translations:Attention Mechanisms/1/zh: Difference between revisions

Previous revision (batch translate Attention Mechanisms unit 1 → zh; tag: translation):

'''注意力机制'''是一类使神经网络在生成输出的每个元素时能够选择性地关注其输入相关部分的技术。注意力最初被提出是为了克服序列到序列模型中固定长度上下文向量的局限性,如今已成为[[Transformer]]等现代架构的基础构建模块。

Revision as of 21:58, 27 April 2026 (batch translate Attention Mechanisms unit 1 → zh; tag: translation):

'''注意力机制'''是一类使神经网络在生成每个输出元素时能够有选择地聚焦于输入相关部分的技术。注意力最初被引入以克服序列到序列模型中固定长度上下文向量的局限性,如今已成为诸如 [[Transformer]] 等现代架构的基础构建模块。

    Message definition (Attention Mechanisms)
    '''Attention mechanisms''' are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in {{Term|sequence-to-sequence}} models, attention has become the foundational building block of modern architectures such as the [[Transformer]].

Current translation (zh): 注意力机制是一类使神经网络在生成每个输出元素时能够有选择地聚焦于输入相关部分的技术。注意力最初被引入以克服序列到序列模型中固定长度上下文向量的局限性,如今已成为诸如 Transformer 等现代架构的基础构建模块。
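
For context on the message being translated: the selective focusing that the definition describes is commonly realized as scaled dot-product attention, the form used in the [[Transformer]]. The sketch below is a minimal NumPy illustration of that idea and is not part of the source article; the function name, shapes, and toy data are all illustrative assumptions.

<syntaxhighlight lang="python">
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weight each input position by relevance and mix the values.

    Q: (n_queries, d_k), K: (n_keys, d_k), V: (n_keys, d_v).
    Each output row is a weighted sum of rows of V, with weights
    given by how strongly each key matches the corresponding query.
    """
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled for a stable softmax.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys: each query gets a distribution over input positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: a selective, weighted combination of the input values.
    return weights @ V

# Toy usage: 2 queries attending over 3 input positions.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 5))
out = scaled_dot_product_attention(Q, K, V)  # shape (2, 5)
</syntaxhighlight>

The softmax step is what makes the focusing "selective": each output element draws mostly from the input positions whose keys best match its query, rather than from a single fixed-length summary of the whole input.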