All translations
Found 3 translations.
| Name | Current message text |
|---|---|
| English (en) | Modern {{Term|deep learning}} frameworks (PyTorch, TensorFlow, JAX) implement backpropagation by constructing a '''computational graph''' — a directed acyclic graph where each node represents an operation and each edge carries a {{Term|tensor}}. The forward pass builds the graph; the backward pass traverses it in reverse topological order, applying the chain rule at every node. |
| Spanish (es) | Los frameworks modernos de {{Term|deep learning|aprendizaje profundo}} (PyTorch, TensorFlow, JAX) implementan la retropropagación construyendo un '''grafo computacional''': un grafo dirigido acíclico donde cada nodo representa una operación y cada arista transporta un {{Term|tensor|tensor}}. El paso hacia adelante construye el grafo; el paso hacia atrás lo recorre en orden topológico inverso, aplicando la regla de la cadena en cada nodo. |
| Chinese (zh) | 现代{{Term|deep learning|深度学习}}框架(PyTorch、TensorFlow、JAX)通过构建'''计算图'''来实现反向传播——这是一种有向无环图,其中每个节点代表一个操作,每条边传递一个{{Term|tensor|张量}}。前向传播构建该图;反向传播按反向拓扑顺序遍历它,并在每个节点应用链式法则。 |
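The message describes the core mechanism that all three frameworks share: a forward pass that records a directed acyclic graph, and a backward pass that walks it in reverse topological order applying the chain rule. A minimal sketch of that idea, using hypothetical scalar-valued `Node`, `add`, `mul`, and `backward` helpers (not any framework's actual API):

```python
# Sketch of reverse-mode autodiff over a computational graph, assuming
# scalar values. Each Node records its parents (incoming edges) and the
# local derivative with respect to each parent.

class Node:
    def __init__(self, value, parents=(), grad_fns=()):
        self.value = value        # result of the forward-pass operation
        self.parents = parents    # nodes feeding into this one
        self.grad_fns = grad_fns  # one local-gradient function per parent
        self.grad = 0.0           # accumulated gradient (filled in backward)

def add(a, b):
    # d(a+b)/da = 1 and d(a+b)/db = 1, so the upstream gradient passes through.
    return Node(a.value + b.value, (a, b), (lambda g: g, lambda g: g))

def mul(a, b):
    # d(a*b)/da = b and d(a*b)/db = a.
    return Node(a.value * b.value, (a, b),
                (lambda g: g * b.value, lambda g: g * a.value))

def backward(out):
    # Build a topological order of the graph by depth-first search,
    # then traverse it in reverse, applying the chain rule at each node.
    order, seen = [], set()
    def visit(n):
        if id(n) not in seen:
            seen.add(id(n))
            for p in n.parents:
                visit(p)
            order.append(n)
    visit(out)
    out.grad = 1.0
    for n in reversed(order):
        for parent, fn in zip(n.parents, n.grad_fns):
            parent.grad += fn(n.grad)  # chain rule, accumulated over edges

# Usage: f(x, y) = (x + y) * y at x=2, y=3.
# Analytically df/dx = y = 3 and df/dy = x + 2y = 8.
x, y = Node(2.0), Node(3.0)
f = mul(add(x, y), y)
backward(f)
print(x.grad, y.grad)  # 3.0 8.0
```

The gradient accumulation (`+=`) matters because a node such as `y` above feeds into two operations; reverse topological order guarantees every node's gradient is complete before its parents consume it.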