Translations:Overfitting and Regularization/22/en

    Revision as of 23:34, 27 April 2026 by FuzzyBot (Importing a new version from external source)

    Dropout (Srivastava et al., 2014) is a regularization technique specific to neural networks. During training, each neuron's output is independently set to zero ("dropped") with probability $ p $ on each forward pass. This prevents neurons from co-adapting and forces the network to learn redundant representations.
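The mechanism above can be sketched in a few lines of NumPy. This is a minimal illustration of *inverted* dropout (the variant used by most modern frameworks), where survivors are rescaled by $1/(1-p)$ during training so that no scaling is needed at test time; the function name, shapes, and seed below are illustrative, not from the original paper.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return x  # at test time the layer is the identity
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# hypothetical activations from a hidden layer of 8 units, batch of 4
h = np.ones((4, 8))
out = dropout(h, p=0.5, training=True, rng=np.random.default_rng(0))
```

Because of the $1/(1-p)$ rescaling, every surviving entry of `out` is $2.0$ and the rest are $0.0$, while a call with `training=False` returns the input untouched.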