Translations:Attention Mechanisms/37/en: Difference between revisions

    Latest revision as of 23:33, 27 April 2026

    Message definition (Attention Mechanisms)
    * Bahdanau, D., Cho, K. and Bengio, Y. (2015). "Neural Machine Translation by Jointly Learning to Align and Translate". ''ICLR''.
    * Luong, M.-T., Pham, H. and Manning, C. D. (2015). "Effective Approaches to Attention-based Neural Machine Translation". ''EMNLP''.
    * Vaswani, A. et al. (2017). "Attention Is All You Need". ''NeurIPS''.
    * Shaw, P., Uszkoreit, J. and Vaswani, A. (2018). "Self-Attention with Relative Position Representations". ''NAACL''.
    * Su, J. et al. (2021). "RoFormer: Enhanced {{Term|transformer}} with Rotary Position {{Term|embedding}}". ''arXiv:2104.09864''.