All translations

Found 3 translations.

Name: English (en)
Dropout typically increases training loss and slows {{Term|convergence}}, since the effective model {{Term|capacity}} is reduced at each {{Term|training step|step}}. However, it decreases the gap between training and validation performance, leading to better {{Term|generalization}}. If training loss is already high ({{Term|underfitting}}), dropout should be reduced or removed.

Name: Spanish (es)
El dropout normalmente aumenta la pérdida de entrenamiento y ralentiza la {{Term|convergence|convergencia}}, ya que la {{Term|capacity|capacidad}} efectiva del modelo se reduce en cada {{Term|training step|paso}}. Sin embargo, disminuye la brecha entre el rendimiento de entrenamiento y de validación, llevando a una mejor {{Term|generalization|generalización}}. Si la pérdida de entrenamiento ya es alta ({{Term|underfitting|subajuste}}), el dropout debe reducirse o eliminarse.

Name: Chinese (zh)
Dropout 通常会增加训练损失并减慢{{Term|convergence|收敛}}速度,因为每个{{Term|training step|训练步}}的有效模型{{Term|capacity|容量}}都会减少。然而,它缩小了训练和验证性能之间的差距,带来更好的{{Term|generalization|泛化能力}}。如果训练损失已经很高({{Term|underfitting|欠拟合}}),则应减少或移除 dropout。
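The behavior the message describes (reduced effective capacity during training, untouched activations at evaluation time) can be sketched as inverted dropout. This is a minimal illustrative implementation in plain Python; the function name and the sample values are assumptions for the sketch, not code from any framework.

```python
import random

def dropout(x, p, training=True):
    """Inverted dropout: zero each unit with probability p during training,
    scaling the survivors by 1/(1-p) so the expected activation is unchanged.
    At evaluation time the input passes through untouched."""
    if not training or p == 0.0:
        return list(x)
    keep = 1.0 - p
    return [0.0 if random.random() < p else v / keep for v in x]

# Training step: each unit is either zeroed or scaled up by 1/(1-p).
random.seed(0)  # seed chosen arbitrarily for reproducibility
activations = [1.0, 2.0, 3.0, 4.0]
train_out = dropout(activations, p=0.5, training=True)

# Evaluation: no units are dropped, so the input is returned unchanged.
eval_out = dropout(activations, p=0.5, training=False)
```

Because survivors are rescaled by 1/(1-p), the layer's expected output matches its evaluation-time output, which is why dropout can be switched off at inference without adjusting the weights.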