Since the model contains no recurrence, positional encodings are added to the input embeddings using sinusoidal functions of different frequencies:
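The original paper defines these encodings as PE(pos, 2i) = sin(pos / 10000^(2i/d_model)) and PE(pos, 2i+1) = cos(pos / 10000^(2i/d_model)), so each dimension of the encoding corresponds to a sinusoid of a different wavelength. A minimal NumPy sketch of this scheme (the function name is ours, not from the paper):

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Return a (seq_len, d_model) matrix of sinusoidal positional encodings."""
    positions = np.arange(seq_len)[:, np.newaxis]          # shape (seq_len, 1)
    # One frequency per pair of dimensions: 10000^(2i/d_model)
    div_terms = np.power(10000.0, np.arange(0, d_model, 2) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(positions / div_terms)            # even dims: sine
    pe[:, 1::2] = np.cos(positions / div_terms)            # odd dims: cosine
    return pe
```

The resulting matrix is simply added element-wise to the input embeddings, which must therefore share the same dimension d_model.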