Dropout: A Simple Way to Prevent Neural Networks from Overfitting is a 2014 paper by Srivastava et al. published in the Journal of Machine Learning Research. The paper formalized and extensively evaluated dropout, a regularization technique in which randomly selected neurons are temporarily removed from the network during training. Dropout prevents complex co-adaptations between neurons, effectively training an exponentially large ensemble of sub-networks within a single architecture, and has become one of the most widely used regularization methods in deep learning.
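The core mechanism can be illustrated with a short sketch. Note that this shows the "inverted dropout" variant commonly used in modern frameworks, which rescales surviving activations by 1/(1-p) during training; the original paper instead multiplies weights by the retention probability at test time. The function name and shapes here are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each unit with probability p during training,
    scaling the survivors by 1/(1-p) so the expected activation matches
    what the full network produces at test time."""
    if not training or p == 0.0:
        return x  # at test time the full network is used unchanged
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return np.where(mask, x / (1.0 - p), 0.0)

# Each forward pass samples a different random sub-network, so training
# effectively averages over an exponential number of thinned networks.
h = np.ones(10_000)
out = dropout(h, p=0.5)
print(out.mean())  # close to 1.0: rescaling preserves the expectation
```

Because a fresh mask is drawn on every forward pass, no single neuron can rely on the presence of any particular other neuron, which is what discourages co-adaptation.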