All translations

Found 3 translations.

English (en)

* '''Word2Vec / GloVe''' — static word {{Term|embedding|embeddings}} pretrained on large corpora.
* '''ELMo''' — contextualised {{Term|embedding|embeddings}} from bidirectional {{Term|long short-term memory|LSTMs}}.
* '''BERT''' (Devlin et al., 2019) — bidirectional {{Term|transformer}} pretrained with masked language modelling; fine-tuned for classification, QA, NER, and more.
* '''GPT series''' — autoregressive {{Term|transformer|Transformers}} demonstrating that scale and {{Term|pre-training|pretraining}} enable few-shot and zero-shot transfer.

Spanish (es)

* '''Word2Vec / GloVe''' — {{Term|embedding|embeddings}} de palabras estáticos preentrenados en grandes corpus.
* '''ELMo''' — {{Term|embedding|embeddings}} contextualizados a partir de {{Term|long short-term memory|LSTMs}} bidireccionales.
* '''BERT''' (Devlin et al., 2019) — {{Term|transformer|transformer}} bidireccional preentrenado con modelado de lenguaje enmascarado; ajustado para clasificación, QA, NER y más.
* '''Serie GPT''' — {{Term|transformer|Transformers}} autorregresivos que demuestran que la escala y el {{Term|pre-training|preentrenamiento}} permiten la transferencia con pocos ejemplos y sin ejemplos.

Chinese (zh)

* '''Word2Vec / GloVe''' — 在大型语料库上预训练的静态词{{Term|embedding|嵌入}}。
* '''ELMo''' — 来自双向 {{Term|long short-term memory|LSTM}} 的上下文化{{Term|embedding|嵌入}}。
* '''BERT'''(Devlin 等,2019)— 通过掩码语言建模预训练的双向 {{Term|transformer|Transformer}};可针对分类、问答、命名实体识别等任务进行微调。
* '''GPT 系列''' — 自回归 {{Term|transformer|Transformer}},证明了规模与{{Term|pre-training|预训练}}能够实现少样本和零样本迁移。
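
The masked language modelling objective mentioned in the BERT entry can be sketched in a few lines. This is a minimal illustration, not BERT's actual procedure: real BERT masks about 15 % of tokens and, of those, replaces 80 % with `[MASK]`, keeps 10 % unchanged, and swaps 10 % for a random token; the simplified rule below (mask with a single probability, predict the original) keeps only the core idea. All names here are illustrative.

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=None):
    """Simplified BERT-style masking: replace each token with mask_token
    with probability mask_prob; the original token becomes the prediction
    target at that position. Unmasked positions carry no training signal."""
    rng = random.Random(seed)
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(mask_token)
            labels.append(tok)        # model must recover this token
        else:
            inputs.append(tok)
            labels.append(None)       # no loss computed here
    return inputs, labels

sentence = ["the", "cat", "sat", "on", "the", "mat"]
inputs, labels = mask_tokens(sentence, mask_prob=0.5, seed=3)
```

During pretraining the model sees `inputs` and is trained to predict the original token at every position where `labels` is not `None`; this is what lets BERT condition on both left and right context, unlike the autoregressive GPT objective.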