Translations:Word Embeddings/41/en: Difference between revisions
Revision as of 22:01, 27 April 2026
- ELMo (Peters et al., 2018) — uses a bidirectional LSTM to generate context-dependent word representations.
- BERT (Devlin et al., 2019) — uses a Transformer encoder trained with masked language modelling.
- GPT series (Radford et al., 2018–) — uses a Transformer decoder trained autoregressively.
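The common thread in the models above is that a word's vector depends on its sentence, unlike static embeddings (word2vec, GloVe), which assign one fixed vector per type. The following toy sketch illustrates that distinction with a crude, hand-fixed neighbour-mixing rule standing in for a learned encoder; the vocabulary, vectors, and mixing weights are all hypothetical, not taken from ELMo or BERT.

```python
import numpy as np

# Hypothetical static embedding table (random 4-d vectors, not real weights).
rng = np.random.default_rng(0)
vocab = ["the", "river", "money", "bank"]
static = {w: rng.normal(size=4) for w in vocab}

def contextual(sentence, i, window=1):
    """Crude stand-in for a contextual encoder: mix token i's static
    vector with the mean of its neighbours' vectors. Real models
    (ELMo's biLSTM, BERT's Transformer encoder) *learn* this mixing;
    here it is fixed at 50/50 purely for illustration."""
    neigh = [static[w] for j, w in enumerate(sentence)
             if j != i and abs(j - i) <= window]
    return 0.5 * static[sentence[i]] + 0.5 * np.mean(neigh, axis=0)

s1 = ["the", "river", "bank"]   # "bank" as in riverbank
s2 = ["the", "money", "bank"]   # "bank" as in financial institution

# A static embedding gives "bank" the same vector in both sentences;
# the context-dependent version differs because the neighbours differ.
v1 = contextual(s1, 2)
v2 = contextual(s2, 2)
print(np.allclose(v1, v2))  # False: the two "bank" vectors diverge
```

The same contrast drives the training objectives listed above: masked language modelling (BERT) and autoregressive prediction (GPT) both force the encoder to produce representations that reflect surrounding context.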