Translations:Attention Mechanisms/27/zh: Difference between revisions


    Latest revision as of 23:36, 27 April 2026

    Message definition (Attention Mechanisms)
    Because self-attention is permutation-invariant (it treats the input as an unordered set), positional information must be injected explicitly. The original {{Term|transformer}} uses sinusoidal encodings:

    由于自注意力是置换不变的(它将输入视为无序集合),位置信息必须显式注入。原始 Transformer 使用正弦编码:
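The sinusoidal encoding referenced by this unit can be sketched in a few lines. This is a minimal NumPy illustration of the standard formulation (PE[pos, 2i] = sin(pos / 10000^(2i/d)), PE[pos, 2i+1] = cos(pos / 10000^(2i/d))); the function name and shapes are illustrative, not part of this page.

```python
import numpy as np

def sinusoidal_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal position encodings."""
    pos = np.arange(seq_len)[:, None]          # positions, shape (seq_len, 1)
    i = np.arange(0, d_model, 2)[None, :]      # even dims, shape (1, d_model/2)
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)               # even indices: sine
    pe[:, 1::2] = np.cos(angles)               # odd indices: cosine
    return pe

pe = sinusoidal_encoding(seq_len=8, d_model=16)
print(pe.shape)  # (8, 16)
```

Adding `pe` to the token embeddings gives each position a distinct signature, which is what breaks the permutation invariance described above.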