Translations:Attention Mechanisms/27/es


    Because self-attention is permutation-invariant (it treats the input as an unordered set), positional information must be injected explicitly. The original Transformer uses sinusoidal encodings:
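    As a reference, a minimal sketch of the standard sinusoidal formulation from the original Transformer paper ("Attention Is All You Need"); the symbols pos (token position), i (dimension index), and d_model (embedding dimension) follow the paper's notation and are an assumption about how this document names them:

    <math>PE_{(pos,\,2i)} = \sin\!\left(\frac{pos}{10000^{2i/d_{\text{model}}}}\right), \qquad PE_{(pos,\,2i+1)} = \cos\!\left(\frac{pos}{10000^{2i/d_{\text{model}}}}\right)</math>

    Each dimension of the encoding thus corresponds to a sinusoid of a different wavelength, which lets the model attend to relative positions.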