Translations:Dropout A Simple Way to Prevent Overfitting/4/en


Deep neural networks with many parameters are powerful function approximators, but they are prone to overfitting, especially when training data is limited. Traditional regularization methods such as L2 weight decay and early stopping provided some relief, but they were often insufficient for large networks. Model combination (training multiple models and averaging their predictions) was known to reduce overfitting, but it was computationally expensive at both training and test time.
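The two ideas named above can be illustrated with a minimal sketch (not from the paper): L2 weight decay shrinking a model's weights toward zero, and model combination by averaging the predictions of an ensemble. The ridge-regression solver and the bootstrap-resampling loop below are illustrative choices, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = 2x + noise.
X = rng.normal(size=(50, 1))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=50)

def fit_ridge(X, y, l2):
    """Least squares with an L2 weight-decay penalty:
    w = (X^T X + l2 * I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + l2 * np.eye(d), X.T @ y)

# L2 weight decay shrinks the fitted weights toward zero.
w_plain = fit_ridge(X, y, l2=0.0)
w_decay = fit_ridge(X, y, l2=10.0)
assert abs(w_decay[0]) < abs(w_plain[0])

# Model combination: fit several models on bootstrap resamples of
# the data, then average their predictions (a bagging-style ensemble).
models = []
for _ in range(10):
    idx = rng.integers(0, len(X), size=len(X))
    models.append(fit_ridge(X[idx], y[idx], l2=0.0))

x_test = np.array([[1.0]])
preds = [x_test @ w for w in models]
ensemble_pred = np.mean(preds, axis=0)
```

The ensemble averages out the variance of the individual fits, which is the effect that makes model combination attractive despite its cost: here it requires fitting and storing ten models instead of one.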