Translations:Attention Mechanisms/3/zh

    From Marovi AI
    Revision as of 03:21, 27 April 2026 by DeployBot (talk | contribs) (Batch translate Attention Mechanisms unit 3 → zh)

    Early sequence-to-sequence models used recurrent neural networks to encode the entire input sequence into a single fixed-dimensional vector. This bottleneck forces long-range dependencies to be compressed into a constant-size vector, degrading performance on long sequences. Attention addresses this by letting the decoder consult every encoder hidden state at each generation step, weighting the states by learned relevance scores.
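The weighting scheme described above can be sketched numerically. The paragraph does not specify a scoring function, so this sketch assumes simple dot-product scores between the decoder state and each encoder hidden state (other choices, such as additive scoring, work the same way structurally); the function and variable names are illustrative, not from the original.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attention_context(decoder_state, encoder_states):
    """Compute a context vector as an attention-weighted sum of encoder states.

    decoder_state:  (d,)   current decoder hidden state
    encoder_states: (T, d) one hidden state per input position
    """
    scores = encoder_states @ decoder_state   # (T,) relevance score per position
    weights = softmax(scores)                 # (T,) attention distribution, sums to 1
    context = weights @ encoder_states        # (d,) weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))   # T = 5 input positions, hidden size d = 8
dec = rng.normal(size=(8,))
context, weights = attention_context(dec, enc)
print(context.shape, round(weights.sum(), 6))
```

Because the context vector is recomputed at every decoding step, the model is never forced to squeeze the whole input into one fixed vector; each output token can draw on the input positions most relevant to it.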