
Dropout (Srivastava et al., 2014) is a regularization technique specific to neural networks. During training, each neuron is randomly "dropped" (set to zero) with probability $ p $ at each forward pass. This prevents neurons from co-adapting and forces the network to learn redundant representations. At test time no units are dropped; instead, activations are scaled by $ 1 - p $ (or, equivalently, the surviving activations are scaled by $ 1/(1-p) $ during training, known as inverted dropout), so that expected activations match between training and inference.
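As a minimal sketch of the idea, the following NumPy function implements inverted dropout under the convention used above, where $ p $ is the drop probability. The function name and interface are illustrative, not taken from any particular library:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each activation with probability p during
    training, scaling the survivors by 1/(1-p) so the expected activation
    is unchanged and no rescaling is needed at test time."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

# Example: apply dropout to a batch of hidden activations.
h = np.ones((2, 4))
print(dropout(h, p=0.5, rng=np.random.default_rng(0)))
```

Because a fresh random mask is drawn at every forward pass, each mini-batch effectively trains a different thinned sub-network, which is what discourages co-adaptation among neurons.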