Translations:Deep Residual Learning for Image Recognition/4/en

    From Marovi AI
    Revision as of 00:32, 27 April 2026 by FuzzyBot (importing a new version from an external source)

    As neural networks grew deeper in the mid-2010s, researchers observed a counterintuitive degradation problem: with increasing depth, training accuracy saturated and then degraded, not from overfitting but from optimization difficulty. A 56-layer plain network performed worse than a 20-layer network on both the training and test sets, indicating that the deeper model was genuinely harder to optimize rather than simply more prone to overfitting.
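    The degradation is counterintuitive because a deeper network can, by construction, represent any shallower one: copy the shallow network's layers and set the extra layers to identity mappings. The sketch below illustrates this construction in NumPy (the layer sizes, random weights, and ReLU stack are illustrative assumptions, not the paper's architecture): the extended network computes exactly the shallow network's output, so in principle a deeper model should never train worse.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def relu(x):
        return np.maximum(x, 0.0)

    # A small "shallow" network: two random linear layers with ReLU
    # (hypothetical stand-in for a trained 20-layer network).
    W1 = rng.standard_normal((8, 8))
    W2 = rng.standard_normal((8, 8))

    def shallow(x):
        return relu(relu(x @ W1) @ W2)

    # A "deeper" network built from the shallow one by appending
    # extra layers whose weights are the identity matrix.
    I = np.eye(8)

    def deeper(x):
        h = shallow(x)
        for _ in range(3):       # three extra identity layers
            h = relu(h @ I)      # relu(h) == h, since h >= 0 after the shallow net
        return h

    x = rng.standard_normal((4, 8))
    # The constructed deeper network reproduces the shallow output exactly,
    # so any accuracy loss at greater depth must come from optimization,
    # not from a lack of representational capacity.
    assert np.allclose(shallow(x), deeper(x))
    ```

    That such identity-augmented solutions exist, yet solvers fail to find anything comparably good, is what points to an optimization problem rather than overfitting.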