Translations:Attention Mechanisms/37/zh

    Revision as of 04:29, 28 April 2026 by DeployBot (talk | contribs) (Batch translate Attention Mechanisms unit 37 → zh)
    • Bahdanau, D., Cho, K. and Bengio, Y. (2015). "Neural Machine Translation by Jointly Learning to Align and Translate". ICLR.
    • Luong, M.-T., Pham, H. and Manning, C. D. (2015). "Effective Approaches to Attention-based Neural Machine Translation". EMNLP.
    • Vaswani, A. et al. (2017). "Attention Is All You Need". NeurIPS.
    • Shaw, P., Uszkoreit, J. and Vaswani, A. (2018). "Self-Attention with Relative Position Representations". NAACL.
    • Su, J. et al. (2021). "RoFormer: Enhanced Transformer with Rotary Position Embedding". arXiv:2104.09864.