All translations

Found 3 translations.

English (en):
'''{{Term|attention|Attention}} Is All You Need''' is a landmark 2017 paper by Vaswani et al. that introduced the '''{{Term|transformer}}''' architecture, a novel neural network design based entirely on {{Term|attention}} mechanisms. The paper demonstrated that recurrent and {{Term|convolution|convolutional layers}}, previously considered essential for {{Term|sequence-to-sequence}} tasks, could be replaced by self-{{Term|attention}}, yielding superior performance and dramatically improved training efficiency.
Spanish (es):
'''{{Term|attention|Attention}} Is All You Need''' es un artículo emblemático de 2017 de Vaswani et al. que introdujo la arquitectura '''{{Term|transformer}}''', un novedoso diseño de red neuronal basado enteramente en mecanismos de {{Term|attention|atención}}. El artículo demostró que las capas recurrentes y las {{Term|convolution|capas convolucionales}}, anteriormente consideradas esenciales para tareas {{Term|sequence-to-sequence|secuencia a secuencia}}, podían reemplazarse por la auto-{{Term|attention|atención}}, obteniendo un rendimiento superior y una eficiencia de entrenamiento drásticamente mejorada.
Chinese (zh):
'''{{Term|attention|Attention}} Is All You Need''' 是 Vaswani 等人于 2017 年发表的里程碑式论文，引入了 '''{{Term|transformer}}''' 架构——一种完全基于{{Term|attention|注意力}}机制的新颖神经网络设计。该论文证明了之前被认为对{{Term|sequence-to-sequence|序列到序列}}任务至关重要的循环层和{{Term|convolution|卷积层}}可以被自{{Term|attention|注意力}}所替代，从而获得更优的性能并大幅提升训练效率。
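
For reference when reviewing these translations: the {{Term|attention}} mechanism the message text describes is the paper's scaled dot-product attention, in which queries Q, keys K, and values V are matrices and d_k is the key dimension:

\[
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V
\]

The softmax over QK^T produces the attention weights; dividing by \sqrt{d_k} keeps the dot products in a range where the softmax gradients remain usable.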