    • ELMo (Peters et al., 2018) — uses a bidirectional LSTM to generate context-dependent word representations.
    • BERT (Devlin et al., 2019) — uses a Transformer encoder trained with masked language modelling (a usage sketch follows this list).
    • GPT series (Radford et al., 2018–) — uses a Transformer decoder trained autoregressively.
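
The key property shared by these models is that the vector for a word depends on its sentence, unlike static embeddings. The sketch below illustrates this with BERT; it assumes the Hugging Face transformers library, the bert-base-uncased checkpoint, and the two example sentences, none of which come from the original text.

```python
# A minimal sketch: contextual embeddings for the same word in two sentences.
# Assumes the Hugging Face "transformers" library and the "bert-base-uncased"
# checkpoint (illustrative choices, not prescribed by the article).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = [
    "She sat on the bank of the river.",
    "He deposited the cheque at the bank.",
]

vectors = []
with torch.no_grad():
    for text in sentences:
        inputs = tokenizer(text, return_tensors="pt")
        outputs = model(**inputs)  # last_hidden_state: (1, seq_len, 768)
        tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
        idx = tokens.index("bank")  # position of "bank" in this sentence
        vectors.append(outputs.last_hidden_state[0, idx])

# A static embedding would give "bank" one vector; here the two vectors differ
# because each is conditioned on its surrounding sentence.
similarity = torch.cosine_similarity(vectors[0], vectors[1], dim=0)
print(f"cosine similarity between the two 'bank' vectors: {similarity.item():.3f}")
```

The same pattern applies to ELMo and the GPT series: each produces a representation per token occurrence rather than per vocabulary entry, with the architectures differing in how context is encoded (bidirectional LSTM, Transformer encoder, or autoregressive Transformer decoder).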