All translations
Found 3 translations.
| Language | Current message text |
|---|---|
| English (en) | '''BERT: {{Term|pre-training|Pre-training}} of Deep Bidirectional {{Term|transformer|Transformers}} for Language Understanding''' is a 2019 paper by Devlin et al. from Google AI Language that introduced '''BERT''' (Bidirectional Encoder Representations from {{Term|transformer|Transformers}}), a method for {{Term|pre-training}} deep bidirectional language representations. BERT revolutionized NLP by demonstrating that a single pre-trained model could be fine-tuned to achieve state-of-the-art results on a wide range of downstream tasks with minimal task-specific architecture modifications. |
| Spanish (es) | '''BERT: {{Term|pre-training|Preentrenamiento}} de {{Term|transformer|Transformers}} Bidireccionales Profundos para la Comprensión del Lenguaje''' es un artículo de 2019 de Devlin et al. de Google AI Language que introdujo '''BERT''' (Representaciones de Codificador Bidireccional a partir de {{Term|transformer|Transformers}}), un método para el {{Term|pre-training|preentrenamiento}} de representaciones lingüísticas bidireccionales profundas. BERT revolucionó el PLN al demostrar que un único modelo preentrenado podía ajustarse para lograr resultados de vanguardia en una amplia variedad de tareas posteriores con mínimas modificaciones arquitectónicas específicas de cada tarea. |
| Chinese (zh) | '''BERT:用于语言理解的深度双向 {{Term|transformer|Transformers}} {{Term|pre-training|预训练}}''' 是 Google AI Language 的 Devlin 等人于 2019 年发表的论文,该论文提出了 '''BERT'''(基于 {{Term|transformer|Transformers}} 的双向编码器表征),一种用于 {{Term|pre-training|预训练}} 深度双向语言表征的方法。BERT 证明了单个预训练模型只需极少的任务特定架构修改,经过微调即可在各种下游任务上取得最先进的结果,从而彻底改变了自然语言处理领域。 |
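
The English entry above describes BERT's pre-train/fine-tune paradigm: reuse one set of pre-trained encoder weights and attach only a small task-specific head. A minimal sketch of that workflow follows, assuming the Hugging Face transformers library, the bert-base-uncased checkpoint, and a two-label classification task, none of which the message text itself names.

```python
# Sketch of the pre-train/fine-tune paradigm described above.
# Assumptions (not from the original text): Hugging Face `transformers`,
# the `bert-base-uncased` checkpoint, and a 2-label classification task.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Load pre-trained bidirectional encoder weights and attach a freshly
# initialized classification head; fine-tuning would update both.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# One forward pass: the only task-specific architecture is the small head.
inputs = tokenizer("BERT reads context in both directions.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2): one score per label
```

The same checkpoint could instead back a tagging or question-answering head; that interchangeability is what the entry's phrase "minimal task-specific architecture modifications" refers to.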