All translations


Found 3 translations.

Name | Current message text
English (en) | {{Term|bert|BERT}} addressed this limitation by introducing a novel {{Term|pre-training}} objective — '''{{Term|masked language model|masked language modeling}}''' ({{Term|masked language model|MLM}}) — that enables genuine bidirectional {{Term|pre-training}}. Combined with a '''{{Term|next sentence prediction}}''' ({{Term|next sentence prediction|NSP}}) task, {{Term|bert|BERT}} learned rich {{Term|contextual embedding|contextual representations}} that could be transferred to downstream tasks through simple {{Term|fine-tuning}}, eliminating the need for task-specific architectures.
Spanish (es) | {{Term|bert|BERT}} abordó esta limitación introduciendo un nuevo objetivo de {{Term|pre-training|preentrenamiento}} — el '''{{Term|masked language model|modelado de lenguaje enmascarado}}''' ({{Term|masked language model|MLM}}) — que permite un {{Term|pre-training|preentrenamiento}} bidireccional genuino. Combinado con una tarea de '''{{Term|next sentence prediction|predicción de la siguiente oración}}''' ({{Term|next sentence prediction|NSP}}), {{Term|bert|BERT}} aprendió ricas {{Term|contextual embedding|representaciones contextuales}} que podían transferirse a tareas posteriores mediante un simple {{Term|fine-tuning|ajuste fino}}, eliminando la necesidad de arquitecturas específicas de cada tarea.
Chinese (zh) | {{Term|bert|BERT}} 通过引入一种新颖的 {{Term|pre-training|预训练}} 目标——'''{{Term|masked language model|掩码语言建模}}'''({{Term|masked language model|MLM}})——解决了这一限制,从而实现了真正的双向 {{Term|pre-training|预训练}}。结合 '''{{Term|next sentence prediction|下一句预测}}'''({{Term|next sentence prediction|NSP}})任务,{{Term|bert|BERT}} 学到了丰富的 {{Term|contextual embedding|上下文表示}},这些表示可以通过简单的 {{Term|fine-tuning|微调}} 迁移到下游任务,消除了对任务特定架构的需求。
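The message above describes the masked language modeling objective only in prose. As a minimal illustrative sketch (the function name `mlm_mask`, the toy vocabulary, and the 15% selection rate are ours; the 80%/10%/10% replacement split follows the original BERT recipe), the token corruption step could look like this:

```python
import random

MASK_TOKEN = "[MASK]"

def mlm_mask(tokens, vocab, mask_prob=0.15, seed=0):
    """BERT-style masked language model corruption.

    Each token is selected for prediction with probability mask_prob.
    Of the selected tokens, 80% are replaced with [MASK], 10% with a
    random vocabulary token, and 10% are left unchanged. Returns
    (corrupted_tokens, labels), where labels holds the original token
    at predicted positions and None elsewhere.
    """
    rng = random.Random(seed)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # model must predict this token
            r = rng.random()
            if r < 0.8:
                corrupted.append(MASK_TOKEN)
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))  # random replacement
            else:
                corrupted.append(tok)  # kept as-is, still predicted
        else:
            labels.append(None)  # not part of the MLM loss
            corrupted.append(tok)
    return corrupted, labels
```

Because the model never knows which positions were corrupted, it must build representations from both left and right context — this is what makes the pre-training genuinely bidirectional.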