Translations:Overfitting and Regularization/22/en: Difference between revisions

    From Marovi AI
    '''{{Term|dropout}}''' (Srivastava et al., 2014) is a regularization technique specific to neural networks. During training, each neuron is randomly "dropped" (set to zero) with probability <math>p</math> at each forward pass. This prevents neurons from co-adapting and forces the network to learn redundant representations.
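The masking step described above can be sketched as follows. This is a minimal illustration, not the reference implementation from the paper; it uses the common "inverted dropout" variant, where surviving activations are scaled by <math>1/(1-p)</math> at training time so that no rescaling is needed at inference (the scaling detail is an assumption beyond the text).

```python
import numpy as np

def dropout(x, p, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling the survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return x  # at inference time the layer is the identity
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # True where the unit is kept
    return x * mask / (1.0 - p)
```

Because each forward pass draws a fresh mask, no single neuron can rely on a specific set of partners being present, which is what discourages co-adaptation.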

    Revision as of 22:02, 27 April 2026
