Translations:Stochastic Gradient Descent/1/en
'''Stochastic gradient descent''' (often abbreviated '''{{Term|SGD|SGD}}''') is an iterative optimisation algorithm used to minimise an {{Term|objective function|objective function}} written as a sum of differentiable sub-functions. It is the workhorse behind modern machine-learning training, powering everything from logistic regression to deep neural networks.
Revision as of 22:03, 27 April 2026
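The definition above can be sketched in a few lines of Python: the objective is a sum of differentiable sub-functions f_i, and each step updates the parameters using the gradient of a single randomly chosen f_i rather than the full sum. This is a minimal illustrative sketch (plain SGD, constant learning rate, no momentum); the toy least-squares data and the names <code>sgd</code> and <code>grad_i</code> are hypothetical, not from the article.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

def sgd(grad_i, w, n, lr=0.01, epochs=100):
    """Minimise f(w) = sum_i f_i(w) by stepping along the gradient
    of one randomly chosen sub-function f_i at a time."""
    for _ in range(epochs):
        for i in random.sample(range(n), n):  # visit sub-functions in random order
            g = grad_i(i, w)
            w = [wj - lr * gj for wj, gj in zip(w, g)]
    return w

# Toy least-squares problem: fit y = w0 + w1*x to noiseless points
# generated by y = 2x + 1, so the minimiser is w = [1, 2].
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]

def grad_i(i, w):
    # Gradient of f_i(w) = (w0 + w1*x_i - y_i)^2 / 2 with respect to (w0, w1)
    err = w[0] + w[1] * xs[i] - ys[i]
    return [err, err * xs[i]]

w = sgd(grad_i, [0.0, 0.0], n=len(xs), lr=0.05, epochs=500)
```

Because each step sees only one data point, SGD is far cheaper per iteration than full gradient descent on large sums, at the cost of noisier updates; on this noiseless toy problem the iterates settle close to the exact minimiser [1, 2].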