Translations:BERT Pre-training of Deep Bidirectional Transformers/29/en

    • Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Proceedings of NAACL-HLT 2019. arXiv:1810.04805
    • Peters, M. E., Neumann, M., Iyyer, M., et al. (2018). Deep Contextualized Word Representations. NAACL 2018.
    • Radford, A., Narasimhan, K., Salimans, T., & Sutskever, I. (2018). Improving Language Understanding by Generative Pre-Training. OpenAI.