Translations:Attention Mechanisms/27/zh: Difference between revisions

    Edit summary (both revisions): (Batch translate Attention Mechanisms unit 27 → zh)
    Tag: translation

    Older text (Line 1):
    由于自注意力对置换不变(它将输入视为无序集合),因此必须显式注入位置信息。原始 Transformer 使用正弦位置编码:

    Newer text (Line 1):
    由于自注意力是排列不变的(它将输入视为无序集合),必须显式注入位置信息。原始 Transformer 使用正弦位置编码:

    The revision changes the Chinese rendering of "permutation-invariant" from 置换不变 to 排列不变 and drops the redundant 因此 ("therefore"); the meaning of the English source is unchanged.

    Revision as of 21:58, 27 April 2026

    Message definition (Attention Mechanisms)
    Because self-attention is permutation-invariant (it treats the input as an unordered set), positional information must be injected explicitly. The original {{Term|transformer}} uses sinusoidal encodings:
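
    A quick illustration of the claim in the definition: permuting the rows (tokens) of the input to plain self-attention simply permutes the rows of the output, so no positional information survives. A minimal NumPy sketch, assuming identity query/key/value projections (all names here are illustrative, not from the wiki):

        import numpy as np

        def self_attention(X):
            # Plain single-head self-attention with identity Q/K/V projections,
            # kept minimal to expose the permutation behaviour.
            scores = X @ X.T / np.sqrt(X.shape[1])
            weights = np.exp(scores - scores.max(axis=1, keepdims=True))
            weights /= weights.sum(axis=1, keepdims=True)   # row-wise softmax
            return weights @ X

        rng = np.random.default_rng(0)
        X = rng.normal(size=(5, 8))        # 5 tokens, d_model = 8
        perm = rng.permutation(5)
        # Permuting the input tokens permutes the output tokens identically:
        assert np.allclose(self_attention(X)[perm], self_attention(X[perm]))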

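    The definition ends at a colon, so the encoding formula itself lives in the next translation unit and is not part of this diff. For context, a sketch of the standard sinusoidal encoding from "Attention Is All You Need" (the function name and the even d_model assumption are ours):

        import numpy as np

        def sinusoidal_encoding(max_len, d_model):
            # PE[pos, 2i]   = sin(pos / 10000**(2i / d_model))
            # PE[pos, 2i+1] = cos(pos / 10000**(2i / d_model))
            # Assumes d_model is even.
            positions = np.arange(max_len)[:, None]     # (max_len, 1)
            dims = np.arange(0, d_model, 2)[None, :]    # (1, d_model // 2)
            angles = positions / np.power(10000.0, dims / d_model)
            pe = np.zeros((max_len, d_model))
            pe[:, 0::2] = np.sin(angles)
            pe[:, 1::2] = np.cos(angles)
            return pe

        # Added to the token embeddings before the first attention layer, this
        # gives otherwise position-blind self-attention a notion of order.
        pe = sinusoidal_encoding(max_len=512, d_model=64)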