The transformer follows an encoder-decoder structure. The encoder maps an input sequence of symbol representations to a sequence of continuous representations; the decoder then generates an output sequence one element at a time in an autoregressive fashion, consuming its previously generated symbols as additional input when producing the next.
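The encode-then-generate flow can be sketched as a toy program. The functions below are hypothetical stand-ins, not the actual Transformer layers: `encode` fakes the mapping from symbols to continuous vectors, and `decode_step` fakes one autoregressive decoding step, conditioning on the encoder output and everything generated so far.

```python
# Minimal sketch of the encoder-decoder flow, with toy stand-in functions.

def encode(input_symbols):
    # Map each input symbol to a continuous representation
    # (here: a fake 2-dim vector derived from the symbol value).
    return [(float(s), float(s) / 2.0) for s in input_symbols]

def decode_step(memory, generated):
    # Produce the next output element from the encoder output ("memory")
    # and all previously generated elements (autoregressive conditioning).
    # Toy rule: mean of the memory plus the count generated so far.
    mean = sum(v[0] for v in memory) / len(memory)
    return int(mean) + len(generated)

def generate(input_symbols, max_len=4):
    memory = encode(input_symbols)   # continuous representations z
    generated = []
    for _ in range(max_len):         # one element at a time
        generated.append(decode_step(memory, generated))
    return generated

print(generate([2, 4, 6]))  # -> [4, 5, 6, 7]
```

The key structural point is the loop in `generate`: each step sees both the fixed encoder output and the growing prefix of generated elements, which is what "autoregressive" means here.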