Dropout became standard practice in neural network training throughout the 2010s and is included by default in most deep learning frameworks. The technique's conceptual simplicity and consistent effectiveness made the original paper one of the most cited in machine learning. The idea of stochastic regularization through random perturbation during training influenced many subsequent techniques, including DropConnect, DropBlock, stochastic depth, and data augmentation strategies.
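The random perturbation dropout applies is simple enough to sketch in a few lines. The following is a minimal NumPy illustration of the "inverted dropout" variant that frameworks commonly use by default (the function name and parameters here are illustrative, not any particular library's API):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p
    and scale survivors by 1/(1-p) so the expected activation is unchanged.
    At test time the input passes through untouched."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

# Training-time call: roughly half the activations are zeroed,
# the rest are doubled, so the mean stays near the original.
activations = np.ones(1000)
dropped = dropout(activations, p=0.5, rng=np.random.default_rng(0))
```

Because the scaling by 1/(1-p) happens at training time, no rescaling is needed at inference, which is one reason this variant became the framework default.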