Translations:Overfitting and Regularization/34/en: Difference between revisions


    Latest revision as of 23:34, 27 April 2026

    * '''{{Term|batch normalization}}''' — normalizing layer inputs reduces internal covariate shift and has a mild regularizing effect.
    * '''Label smoothing''' — replaces {{Term|one-hot encoding|one-hot}} targets with a mixture, e.g. <math>y_{\text{smooth}} = (1 - \epsilon)\, y + \epsilon / C</math>, preventing overconfidence.
    * '''Noise injection''' — adding Gaussian noise to inputs, weights, or gradients during training.
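The batch-normalization transform above can be sketched in a few lines of NumPy: each feature is standardized over the batch axis, then rescaled and shifted by learnable parameters. This is a minimal illustration, not a training-ready layer — the function name and the defaults <math>\gamma = 1</math>, <math>\beta = 0</math> are illustrative, and running statistics for inference are omitted.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    """Normalize each feature over the batch axis, then rescale and shift."""
    mu = x.mean(axis=0)                   # per-feature batch mean
    var = x.var(axis=0)                   # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps) # standardized activations
    return gamma * x_hat + beta

x = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
out = batch_norm(x)  # each column now has ~zero mean and ~unit variance
```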
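The label-smoothing formula <math>y_{\text{smooth}} = (1 - \epsilon)\, y + \epsilon / C</math> translates directly into code. A minimal NumPy sketch (the function name and the choice <math>\epsilon = 0.1</math> are illustrative):

```python
import numpy as np

def smooth_labels(y_onehot, epsilon=0.1):
    """Mix one-hot targets with the uniform distribution over C classes."""
    C = y_onehot.shape[-1]  # number of classes
    return (1.0 - epsilon) * y_onehot + epsilon / C

y = np.array([0.0, 1.0, 0.0])       # one-hot target, C = 3
y_smooth = smooth_labels(y, 0.1)    # still sums to 1, but no 0/1 extremes
```

The smoothed target remains a valid probability distribution, but the loss now penalizes the model for pushing its predicted probability for the true class all the way to 1.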
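Noise injection at the input, as described above, amounts to adding a zero-mean Gaussian perturbation to each training example. A minimal sketch (the function name and <math>\sigma</math> value are illustrative; at inference time the noise would be disabled):

```python
import numpy as np

rng = np.random.default_rng(0)

def add_input_noise(x, sigma=0.1):
    """Add zero-mean Gaussian noise to the inputs (train time only)."""
    return x + rng.normal(0.0, sigma, size=x.shape)

x = np.ones((2, 3))
x_noisy = add_input_noise(x, sigma=0.05)  # each entry perturbed slightly
```

The same idea applies to weights or gradients by perturbing those tensors instead of the inputs.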