Language Models are Few-Shot Learners


Language Models are Few-Shot Learners is a 2020 paper by Brown et al. from OpenAI that introduced GPT-3, a 175-billion-parameter autoregressive language model. The paper demonstrated that sufficiently large language models can perform a wide variety of NLP tasks through in-context learning: conditioning on a few examples provided in the prompt, without any gradient updates or fine-tuning.
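In-context learning works by placing a natural-language task description and a handful of solved demonstrations directly in the prompt, then appending the new query; the model completes the pattern. A minimal sketch of assembling such a few-shot prompt (the helper function name and `=>` separator are illustrative choices, not the paper's exact format; the English-to-French word pairs mirror the kind of translation demonstration shown in the paper):

```python
def build_few_shot_prompt(task_description, examples, query):
    """Assemble a few-shot prompt: a task description, K solved
    demonstrations, and finally the unsolved query for the model
    to complete. No weights are updated; the examples live only
    in the prompt text."""
    lines = [task_description]
    for source, target in examples:
        lines.append(f"{source} => {target}")
    # The query ends with the separator so the model's natural
    # continuation is the answer.
    lines.append(f"{query} =>")
    return "\n".join(lines)


examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
]
prompt = build_few_shot_prompt(
    "Translate English to French:", examples, "peppermint"
)
print(prompt)
```

Varying the number of demonstrations gives the zero-shot (no examples), one-shot (one example), and few-shot (typically 10 to 100 examples) settings the paper evaluates.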