Translations:Overfitting and Regularization/22/en
Revision as of 22:02, 27 April 2026
'''Dropout''' (Srivastava et al., 2014) is a regularization technique specific to neural networks. During training, each neuron is randomly "dropped" (set to zero) with probability <math>p</math> at each forward pass. This prevents neurons from co-adapting and forces the network to learn redundant representations.
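The mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration using the common "inverted dropout" variant, in which surviving activations are scaled by <math>1/(1-p)</math> during training so that no rescaling is needed at test time; the function name and signature are illustrative, not from any particular library.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling the survivors by 1/(1-p) to preserve the expected activation."""
    if not training or p == 0.0:
        return x  # at test time the full network is used unchanged
    rng = rng if rng is not None else np.random.default_rng()
    mask = rng.random(x.shape) >= p  # each unit kept with probability 1 - p
    return x * mask / (1.0 - p)

# Example: with p = 0.5, roughly half the units are zeroed and the
# survivors are doubled, so the expected value of each unit is preserved.
activations = np.ones(8)
print(dropout(activations, p=0.5, rng=np.random.default_rng(0)))
```

Because the mask is resampled at every forward pass, each minibatch effectively trains a different "thinned" subnetwork, which is the source of the regularizing effect.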