Translations:Attention Mechanisms/1/zh

    From Marovi AI
    Revision as of 23:36, 27 April 2026 by DeployBot (talk | contribs) (Batch translate Attention Mechanisms unit 1 → zh)

    Attention mechanisms are a class of techniques that allow a neural network to selectively focus on the relevant parts of the input when generating each element of the output. Originally introduced to overcome the limitations of the fixed-length context vector in sequence-to-sequence models, attention has since become a fundamental building block of modern architectures such as the Transformer.
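    The selective-focus idea described above can be sketched as scaled dot-product attention, the variant used in the Transformer. The NumPy function below is a minimal illustration under that assumption, not an implementation from this page: each query is compared against all keys, the similarities are turned into a probability distribution, and the values are averaged under that distribution.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal scaled dot-product attention sketch.

    Q: (n_q, d) queries, K: (n_k, d) keys, V: (n_k, d_v) values.
    Returns the (n_q, d_v) output and the (n_q, n_k) attention weights.
    """
    d = Q.shape[-1]
    # Similarity of every query to every key, scaled by sqrt(d)
    scores = Q @ K.T / np.sqrt(d)
    # Softmax over keys (shift by the row max for numerical stability)
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value vectors
    return weights @ V, weights
```

    Because each row of the weight matrix sums to 1, the output is a convex combination of the value vectors: the network "attends" more strongly to inputs whose keys match the query.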