Translations:Neural Networks/18/zh

    From Marovi AI
    Revision as of 00:25, 27 April 2026 by DeployBot (talk | contribs) ([deploy-bot] Translate Neural Networks unit 18 to zh)
    | Function | Formula | Range | Notes |
    |---|---|---|---|
    | Sigmoid | $ \sigma(z) = \frac{1}{1+e^{-z}} $ | (0, 1) | Historically common; suffers from vanishing gradients |
    | Tanh | $ \tanh(z) = \frac{e^z - e^{-z}}{e^z + e^{-z}} $ | (−1, 1) | Zero-centered; still saturates for large inputs |
    | ReLU | $ \max(0, z) $ | [0, ∞) | Default choice in modern networks; can cause "dead neurons" |
    | Leaky ReLU | $ \max(\alpha z, z) $, with small $ \alpha > 0 $ | (−∞, ∞) | Mitigates the dead-neuron problem |
    | Softmax | $ \frac{e^{z_i}}{\sum_j e^{z_j}} $ | (0, 1) | Used in the output layer for multi-class classification |
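The activations in the table can be sketched directly from their formulas. Below is a minimal NumPy implementation; the function names and the Leaky ReLU default `alpha=0.01` are illustrative choices, not from the original text, and the softmax subtracts the row maximum for numerical stability (a standard trick that leaves the formula's value unchanged).

```python
import numpy as np

def sigmoid(z):
    # sigma(z) = 1 / (1 + e^{-z}); output in (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    # zero-centered; output in (-1, 1); saturates for large |z|
    return np.tanh(z)

def relu(z):
    # max(0, z); output in [0, inf); gradient is 0 for z < 0,
    # which is what can produce "dead neurons"
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    # max(alpha*z, z) with small alpha > 0; keeps a small gradient
    # for negative inputs, mitigating dead neurons
    return np.where(z > 0, z, alpha * z)

def softmax(z):
    # e^{z_i} / sum_j e^{z_j} over the last axis; subtracting the max
    # avoids overflow without changing the result. Each component is
    # in (0, 1) and each row sums to 1.
    shifted = z - np.max(z, axis=-1, keepdims=True)
    e = np.exp(shifted)
    return e / np.sum(e, axis=-1, keepdims=True)
```

For example, `softmax(np.array([1.0, 2.0, 3.0]))` returns a probability vector whose entries sum to 1, with the largest weight on the last logit.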