Translations:Neural Networks/20/en: Difference between revisions

The '''universal approximation theorem''' (Cybenko 1989, Hornik 1991) states that a feedforward network with a single hidden layer containing a finite number of neurons can approximate any continuous function on a compact subset of <math>\mathbb{R}^n</math> to arbitrary accuracy, provided the activation function satisfies mild conditions (e.g., it is non-constant, bounded, and continuous).
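The theorem guarantees that an approximant of the form <math>\sum_i \alpha_i \, \sigma(w_i \cdot x + b_i)</math> exists, but not how to find it. As an illustration only (not the constructive proof), the following sketch fixes random hidden-layer weights and solves a least-squares problem for the output weights of a single-hidden-layer network; the target function, neuron count, and weight ranges are arbitrary choices for the demonstration:

```python
import numpy as np

# Illustration of the universal approximation idea: a single hidden
# layer of finitely many tanh neurons approximating a continuous
# function on a compact set. Hidden weights are fixed at random and
# only the output weights are fitted (a least-squares shortcut, not
# the construction used in the proofs).
rng = np.random.default_rng(0)
n_hidden = 100                       # finite number of hidden neurons
x = np.linspace(0.0, 1.0, 200)       # points in a compact subset of R
f = np.sin(3.0 * x)                  # a continuous target function

# tanh is non-constant, bounded, and continuous, so it satisfies the
# theorem's mild conditions on the activation function.
w = rng.uniform(-10.0, 10.0, n_hidden)
b = rng.uniform(-10.0, 10.0, n_hidden)
H = np.tanh(np.outer(x, w) + b)      # hidden activations, shape (200, n_hidden)

# The network output is the linear combination sum_i alpha_i * tanh(w_i x + b_i);
# solve for the output weights alpha by least squares.
alpha, *_ = np.linalg.lstsq(H, f, rcond=None)
max_err = np.max(np.abs(H @ alpha - f))
print(max_err)
```

Widening the hidden layer (increasing <code>n_hidden</code>) drives the achievable error down, which is the practical content of the theorem.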

    Revision as of 19:42, 27 April 2026
