All translations

Found 3 translations.

{| class="wikitable"
! Name !! Current message text
|-
| English (en) || '''{{Term|dropout}}''' (Srivastava et al., 2014) is a regularization technique specific to neural networks. During training, each neuron is randomly "dropped" (set to zero) with probability <math>p</math> at each forward pass. This prevents neurons from co-adapting and forces the network to learn redundant representations.
|-
| Spanish (es) || '''{{Term|dropout}}''' (Srivastava et al., 2014) es una técnica de regularización específica para redes neuronales. Durante el entrenamiento, cada neurona se "descarta" aleatoriamente (se pone a cero) con probabilidad <math>p</math> en cada paso hacia adelante. Esto impide que las neuronas se coadapten y obliga a la red a aprender representaciones redundantes.
|-
| Chinese (zh) || '''{{Term|dropout}}'''(Srivastava 等,2014)是一种专门针对神经网络的正则化技术。在训练过程中,每个神经元在每次前向传播时以概率 <math>p</math> 被随机“丢弃”(置为零)。这可以防止神经元之间的共适应,并迫使网络学习冗余的表示。
|}
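The mechanism described in the message can be sketched in a few lines of NumPy. This is an illustrative example, not code from the source; it implements the common "inverted dropout" variant, which rescales the surviving activations by <math>1/(1-p)</math> during training so that the expected activation is unchanged and no scaling is needed at inference time.

```python
import numpy as np

def dropout(x, p, training=True, rng=None):
    """Inverted dropout (illustrative sketch).

    During training, each unit is zeroed independently with probability p,
    and the survivors are scaled by 1/(1-p) to keep the expected value of
    the activations constant. At inference time the input passes through
    unchanged.
    """
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng()
    # Each unit survives with probability 1 - p.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)
```

Because of the rescaling, the output has the same expected value as the input, which is why dropout layers can simply be disabled at test time in this formulation.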