Translations:Attention Mechanisms/37/es: Difference between revisions
Latest revision as of 23:36, 27 April 2026
- Bahdanau, D., Cho, K. and Bengio, Y. (2015). "Neural Machine Translation by Jointly Learning to Align and Translate". ICLR.
- Luong, M.-T., Pham, H. and Manning, C. D. (2015). "Effective Approaches to Attention-based Neural Machine Translation". EMNLP.
- Vaswani, A. et al. (2017). "Attention Is All You Need". NeurIPS.
- Shaw, P., Uszkoreit, J. and Vaswani, A. (2018). "Self-Attention with Relative Position Representations". NAACL.
- Su, J. et al. (2021). "RoFormer: Enhanced Transformer with Rotary Position Embedding". arXiv:2104.09864.