- Batch normalization — normalizing each layer's inputs over the mini-batch reduces internal covariate shift and, because batch statistics are noisy estimates, has a mild regularizing side effect (see the forward-pass sketch after this list).
- Label smoothing — replaces one-hot targets with a mixture, e.g. $ y_{\text{smooth}} = (1 - \epsilon)\, y + \epsilon / C $, where $C$ is the number of classes and $\epsilon$ a small constant, preventing overconfident predictions (see the code sketch below).
- Noise injection — adding zero-mean Gaussian noise to inputs, weights, or gradients during training, which discourages the model from fitting exact feature values (see the input-noise sketch below).
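
To make the first item concrete, here is a minimal NumPy sketch of the batch-norm forward pass. The function name `batchnorm_forward` and the `(batch, features)` layout are illustrative assumptions; real framework implementations additionally track running statistics for use at inference time.

```python
import numpy as np

def batchnorm_forward(x, gamma, beta, eps=1e-5):
    """Batch-norm forward pass for a (batch, features) array.

    gamma and beta are the learned scale and shift parameters;
    eps guards against division by zero. Illustrative sketch only.
    """
    mu = x.mean(axis=0)                    # per-feature batch mean
    var = x.var(axis=0)                    # per-feature batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize to roughly zero mean, unit variance
    return gamma * x_hat + beta            # learned scale/shift restores expressiveness
```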
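The label-smoothing formula translates directly into code. In this NumPy sketch the function name `smooth_labels` and the value $\epsilon = 0.1$ are illustrative choices, not from the original:

```python
import numpy as np

def smooth_labels(y_onehot, eps=0.1):
    """Apply y_smooth = (1 - eps) * y + eps / C to one-hot targets.

    y_onehot: (batch, C) one-hot array; eps: smoothing strength.
    """
    C = y_onehot.shape[1]                  # number of classes
    return (1.0 - eps) * y_onehot + eps / C

# Example: with C = 4 and eps = 0.1, the true class gets
# 0.9 + 0.025 = 0.925 and every other class gets 0.025.
y = np.eye(4)[[2]]                         # one-hot target for class 2
print(smooth_labels(y))                    # [[0.025 0.025 0.925 0.025]]
```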
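Finally, a sketch of input-noise injection under the same assumptions (the function name `add_input_noise` and the noise scale `sigma=0.1` are illustrative). Weight or gradient noise works the same way, perturbing those tensors instead of the inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_input_noise(x, sigma=0.1, training=True):
    """Inject zero-mean Gaussian noise into inputs during training only.

    sigma controls the noise scale; at evaluation time the input
    passes through unchanged.
    """
    if not training:
        return x
    return x + rng.normal(0.0, sigma, size=x.shape)
```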