
    Dropout (Srivastava et al., 2014) is a regularization technique specific to neural networks. During training, each neuron is randomly "dropped" (set to zero) with probability $ p $ at each forward pass. This prevents neurons from co-adapting and forces the network to learn redundant representations.
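The mechanism described above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production implementation; it uses the common "inverted dropout" variant, which scales the surviving activations by $ 1/(1-p) $ during training so that no rescaling is needed at inference time (the function name and interface here are illustrative, not from the original paper).

```python
import numpy as np

def dropout(x, p, rng, training=True):
    """Inverted dropout (illustrative sketch).

    During training, each unit is zeroed independently with probability p,
    and the survivors are scaled by 1/(1-p) so the expected activation
    matches the no-dropout case. At inference, the input passes through
    unchanged.
    """
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p  # each unit kept with probability 1 - p
    return x * mask / (1.0 - p)

rng = np.random.default_rng(0)
x = np.ones((4, 5))
y = dropout(x, p=0.5, rng=rng)
# Roughly half the entries of y are zero; the rest are scaled up to 2.0.
```

Because a fresh random mask is drawn at every forward pass, no single neuron can rely on any particular other neuron being present, which is what discourages co-adaptation.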