Translations:Neural Networks/18/zh

| Function | Formula | Range | Notes |
|---|---|---|---|
| sigmoid | $ \sigma(z) = \frac{1}{1+e^{-z}} $ | (0, 1) | Historically popular; suffers from the vanishing-gradient problem |
| tanh | $ \tanh(z) = \frac{e^z - e^{-z}}{e^z + e^{-z}} $ | (−1, 1) | Zero-centered; still saturates for large inputs |
| ReLU | $ \max(0, z) $ | [0, ∞) | Default choice in modern networks; can cause "dying neurons" |
| leaky ReLU | $ \max(\alpha z, z) $ with small $ \alpha > 0 $ | (−∞, ∞) | Mitigates the dying-neuron problem |
| softmax | $ \frac{e^{z_i}}{\sum_j e^{z_j}} $ | (0, 1) | Used in the output layer for multi-class classification; outputs sum to 1 |
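To make the formulas concrete, below is a minimal NumPy sketch of each activation. The function names, the default $ \alpha = 0.01 $ for leaky ReLU, and the max-subtraction trick in softmax are illustrative assumptions, not part of the table above.

```python
import numpy as np

def sigmoid(z):
    """sigma(z) = 1 / (1 + e^{-z}); outputs in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def tanh(z):
    """(e^z - e^{-z}) / (e^z + e^{-z}); zero-centered, outputs in (-1, 1)."""
    return np.tanh(z)

def relu(z):
    """max(0, z); outputs in [0, inf)."""
    return np.maximum(0.0, z)

def leaky_relu(z, alpha=0.01):
    """max(alpha*z, z) for small alpha > 0; alpha=0.01 is an assumed
    common default, not specified in the table."""
    return np.maximum(alpha * z, z)

def softmax(z):
    """e^{z_i} / sum_j e^{z_j} along the last axis; outputs sum to 1.
    Subtracting the row-wise max before exponentiating is a standard
    numerical-stability trick (an assumption here); it cancels out and
    leaves the result unchanged."""
    shifted = z - np.max(z, axis=-1, keepdims=True)
    exps = np.exp(shifted)
    return exps / np.sum(exps, axis=-1, keepdims=True)

# Quick check on a sample pre-activation vector:
z = np.array([-2.0, 0.0, 3.0])
print(sigmoid(z))     # approx. [0.119, 0.5, 0.953]
print(leaky_relu(z))  # [-0.02, 0.0, 3.0]
print(softmax(z))     # components sum to 1
```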