All translations


Found 3 translations.

{| class="wikitable"
! Name !! Current message text
|-
| English (en) || '''{{Term|dropout}}: A Simple Way to Prevent Neural Networks from {{Term|overfitting}}''' is a 2014 paper by Srivastava et al. published in the Journal of Machine Learning Research. The paper formalized and extensively evaluated '''{{Term|dropout}}''', a {{Term|regularization}} technique in which randomly selected neurons are temporarily removed during training. {{Term|dropout}} prevents complex co-adaptations between neurons, effectively training an exponentially large ensemble of sub-networks within a single architecture, and became one of the most widely used {{Term|regularization}} methods in {{Term|deep learning}}.
|-
| Spanish (es) || '''{{Term|dropout|dropout}}: una forma sencilla de prevenir el {{Term|overfitting|sobreajuste}} en redes neuronales''' es un artículo de 2014 de Srivastava et al. publicado en el Journal of Machine Learning Research. El artículo formalizó y evaluó exhaustivamente el '''{{Term|dropout|dropout}}''', una técnica de {{Term|regularization|regularización}} en la que neuronas seleccionadas aleatoriamente se eliminan temporalmente durante el entrenamiento. El {{Term|dropout|dropout}} evita coadaptaciones complejas entre neuronas, entrenando efectivamente un conjunto exponencialmente grande de subredes dentro de una única arquitectura, y se convirtió en uno de los métodos de {{Term|regularization|regularización}} más utilizados en {{Term|deep learning|aprendizaje profundo}}.
|-
| Chinese (zh) || '''{{Term|dropout|Dropout}}:一种防止神经网络{{Term|overfitting|过拟合}}的简单方法''' 是 Srivastava 等人于 2014 年发表在《Journal of Machine Learning Research》上的论文。该论文形式化并广泛评估了 '''{{Term|dropout|dropout}}''',一种通过在训练期间临时移除随机选择的神经元来实现的{{Term|regularization|正则化}}技术。{{Term|dropout|Dropout}} 可防止神经元之间复杂的共适应,在单一架构内有效训练指数级数量的子网络集成,成为{{Term|deep learning|深度学习}}中应用最广泛的{{Term|regularization|正则化}}方法之一。
|}
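The mechanism described in the message text can be sketched in a few lines of NumPy. This is a minimal illustration of inverted dropout, not code from the paper: each unit is zeroed with probability <code>p</code> during training, and the survivors are scaled by <code>1/(1-p)</code> so that expected activations match evaluation mode, when the layer is the identity. The function name and seeded generator are illustrative choices.

```python
import numpy as np

def dropout(x, p, training=True, rng=None):
    """Inverted dropout: during training, zero each unit with probability p
    and scale survivors by 1/(1-p); at evaluation time, pass x through."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng(0)  # seeded here only for reproducibility
    mask = rng.random(x.shape) >= p        # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

activations = np.ones((4, 5))
out = dropout(activations, p=0.5)         # entries are either 0.0 or 2.0
```

Because a fresh mask is drawn on every call, each training step effectively trains a different sub-network, which is the ensemble interpretation mentioned above.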