* Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: {{Term|pre-training|Pre-training}} of Deep Bidirectional {{Term|transformer|Transformers}} for Language Understanding. ''Proceedings of NAACL-HLT 2019''. [https://arxiv.org/abs/1810.04805 arXiv:1810.04805]
* Peters, M. E., Neumann, M., Iyyer, M., et al. (2018). Deep Contextualized Word Representations. ''NAACL 2018''.
* Radford, A., Narasimhan, K., Salimans, T., & Sutskever, I. (2018). Improving Language Understanding by Generative Pre-Training. ''OpenAI''.