This weight scaling inference rule ensures that the expected output of each neuron at test time equals its expected output during training. An equivalent alternative, inverted dropout, scales activations by $ 1/p $ during training so that no modification is needed at test time. This approach is more common in modern implementations.
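As a rough illustration of the two conventions, the following NumPy sketch (not from the paper; the function and variable names are ours, and $ p $ denotes the retention probability as in the text) shows that standard dropout with weight scaling at test time and inverted dropout produce the same expected activations:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_train_standard(x, p):
    """Standard dropout: keep each unit with probability p during training."""
    mask = rng.random(x.shape) < p
    return x * mask  # expected value is p * x

def dropout_test_standard(x, p):
    """Weight scaling inference rule: scale activations (or weights) by p at test time."""
    return x * p

def dropout_train_inverted(x, p):
    """Inverted dropout: scale retained units by 1/p during training."""
    mask = rng.random(x.shape) < p
    return x * mask / p  # expected value is x, so test time needs no change

def dropout_test_inverted(x, p):
    """No modification needed at test time."""
    return x

# Both conventions match in expectation: training-time means approximate
# the corresponding test-time activations.
x = np.ones(1_000_000)
p = 0.5
print(dropout_train_standard(x, p).mean(), dropout_test_standard(x, p).mean())  # ~0.5 vs 0.5
print(dropout_train_inverted(x, p).mean(), dropout_test_inverted(x, p).mean())  # ~1.0 vs 1.0
```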