All translations
Found 3 translations.
| Language | Current message text |
|---|---|
| English (en) | * '''ELMo''' (Peters et al., 2018) — uses a bidirectional {{Term|long short-term memory|LSTM}} to generate context-dependent word representations. * '''BERT''' (Devlin et al., 2019) — uses a {{Term|transformer}} encoder trained with masked language modelling. * '''GPT''' series (Radford et al., 2018–) — uses a {{Term|transformer}} decoder trained autoregressively. |
| Spanish (es) | * '''ELMo''' (Peters et al., 2018) — utiliza una {{Term|long short-term memory|LSTM}} bidireccional para generar representaciones de palabras dependientes del contexto. * '''BERT''' (Devlin et al., 2019) — utiliza un codificador {{Term|transformer|transformer}} entrenado con modelado de lenguaje enmascarado. * Serie '''GPT''' (Radford et al., 2018–) — utiliza un decodificador {{Term|transformer|transformer}} entrenado de forma autorregresiva. |
| Chinese (zh) | * '''ELMo'''(Peters 等,2018)— 使用双向 {{Term|long short-term memory|LSTM}} 生成依赖上下文的词表示。 * '''BERT'''(Devlin 等,2019)— 使用以掩码语言建模训练的 {{Term|transformer|Transformer}} 编码器。 * '''GPT''' 系列(Radford 等,2018–)— 使用以自回归方式训练的 {{Term|transformer|Transformer}} 解码器。 |
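
The contrast the message draws between BERT's masked objective and GPT's autoregressive objective can be seen in a minimal sketch, assuming the Hugging Face `transformers` library; the checkpoint names `bert-base-uncased` and `gpt2` are illustrative stand-ins and are not part of the message text.

```python
# Minimal sketch, assuming `pip install transformers` and stock checkpoints.
from transformers import pipeline

# BERT-style: a transformer encoder trained with masked language modelling
# predicts a token hidden behind a [MASK] placeholder.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
print(fill_mask("A transformer encoder predicts [MASK] tokens.")[0]["token_str"])

# GPT-style: a transformer decoder trained autoregressively continues a
# prefix one token at a time, left to right.
generate = pipeline("text-generation", model="gpt2")
print(generate("Context-dependent word representations",
               max_new_tokens=20)[0]["generated_text"])
```

ELMo is omitted from the sketch because its pretrained weights are distributed through AllenNLP rather than this library.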