<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://marovi.ai/index.php?action=history&amp;feed=atom&amp;title=Translations%3AAttention_Mechanisms%2F1%2Fen</id>
	<title>Translations:Attention Mechanisms/1/en - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://marovi.ai/index.php?action=history&amp;feed=atom&amp;title=Translations%3AAttention_Mechanisms%2F1%2Fen"/>
	<link rel="alternate" type="text/html" href="https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;action=history"/>
	<updated>2026-04-28T00:44:38Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.39.1</generator>
	<entry>
		<id>https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=17507&amp;oldid=prev</id>
		<title>FuzzyBot: Importing a new version from external source</title>
		<link rel="alternate" type="text/html" href="https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=17507&amp;oldid=prev"/>
		<updated>2026-04-27T23:33:26Z</updated>

		<summary type="html">&lt;p&gt;Importing a new version from external source&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 23:33, 27 April 2026&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Attention mechanisms&amp;#039;&amp;#039;&amp;#039; are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in sequence-to-sequence models, attention has become the foundational building block of modern architectures such as the [[Transformer]].&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Attention mechanisms&amp;#039;&amp;#039;&amp;#039; are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;{{Term|&lt;/ins&gt;sequence-to-sequence&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;}} &lt;/ins&gt;models, attention has become the foundational building block of modern architectures such as the [[Transformer]].&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>FuzzyBot</name></author>
	</entry>
	<entry>
		<id>https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=14228&amp;oldid=prev</id>
		<title>FuzzyBot: Importing a new version from external source</title>
		<link rel="alternate" type="text/html" href="https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=14228&amp;oldid=prev"/>
		<updated>2026-04-27T21:57:35Z</updated>

		<summary type="html">&lt;p&gt;Importing a new version from external source&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 21:57, 27 April 2026&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Attention mechanisms&amp;#039;&amp;#039;&amp;#039; are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in &lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;{{Term|&lt;/del&gt;sequence-to-sequence&lt;del style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;}} &lt;/del&gt;models, attention has become the foundational building block of modern architectures such as the [[Transformer]].&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Attention mechanisms&amp;#039;&amp;#039;&amp;#039; are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in sequence-to-sequence models, attention has become the foundational building block of modern architectures such as the [[Transformer]].&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>FuzzyBot</name></author>
	</entry>
	<entry>
		<id>https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=12997&amp;oldid=prev</id>
		<title>FuzzyBot: Importing a new version from external source</title>
		<link rel="alternate" type="text/html" href="https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=12997&amp;oldid=prev"/>
		<updated>2026-04-27T19:41:38Z</updated>

		<summary type="html">&lt;p&gt;Importing a new version from external source&lt;/p&gt;
&lt;table style=&quot;background-color: #fff; color: #202122;&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #202122; text-align: center;&quot;&gt;Revision as of 19:41, 27 April 2026&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot;&gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;−&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Attention mechanisms&amp;#039;&amp;#039;&amp;#039; are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in sequence-to-sequence models, attention has become the foundational building block of modern architectures such as the [[Transformer]].&lt;/div&gt;&lt;/td&gt;&lt;td class=&quot;diff-marker&quot; data-marker=&quot;+&quot;&gt;&lt;/td&gt;&lt;td style=&quot;color: #202122; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Attention mechanisms&amp;#039;&amp;#039;&amp;#039; are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in &lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;{{Term|&lt;/ins&gt;sequence-to-sequence&lt;ins style=&quot;font-weight: bold; text-decoration: none;&quot;&gt;}} &lt;/ins&gt;models, attention has become the foundational building block of modern architectures such as the [[Transformer]].&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>FuzzyBot</name></author>
	</entry>
	<entry>
		<id>https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=2373&amp;oldid=prev</id>
		<title>FuzzyBot: Importing a new version from external source</title>
		<link rel="alternate" type="text/html" href="https://marovi.ai/index.php?title=Translations:Attention_Mechanisms/1/en&amp;diff=2373&amp;oldid=prev"/>
		<updated>2026-04-27T00:30:19Z</updated>

		<summary type="html">&lt;p&gt;Importing a new version from external source&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;&amp;#039;&amp;#039;&amp;#039;Attention mechanisms&amp;#039;&amp;#039;&amp;#039; are a family of techniques that allow neural networks to focus selectively on relevant parts of their input when producing each element of the output. Originally introduced to overcome the limitations of fixed-length context vectors in sequence-to-sequence models, attention has become the foundational building block of modern architectures such as the [[Transformer]].&lt;/div&gt;</summary>
		<author><name>FuzzyBot</name></author>
	</entry>
</feed>