OpenAI at NeurIPS 2020

We demonstrate that scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even becoming competitive with prior state-of-the-art fine-tuning approaches. Specifically, we train GPT-3, an autoregressive language model with 175 billion parameters, 10x more than any previous non-sparse language model, and test its performance in the few-shot setting.
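In the few-shot setting described above, the model receives a handful of task demonstrations directly in its prompt and must complete a new query with no gradient updates. A minimal sketch (not OpenAI's code; the function name and prompt format are illustrative assumptions) of how such a prompt is assembled:

```python
def build_few_shot_prompt(demos, query, instruction=""):
    """Assemble a few-shot prompt from (input, output) demonstration pairs.

    demos: list of (question, answer) tuples shown to the model as examples.
    query: the new question the model should answer by completing the text.
    """
    lines = [instruction] if instruction else []
    for x, y in demos:
        lines.append(f"Q: {x}\nA: {y}")
    # The prompt ends mid-pattern, so the model's natural continuation
    # is the answer to the final query.
    lines.append(f"Q: {query}\nA:")
    return "\n\n".join(lines)

demos = [("2 + 2", "4"), ("7 + 5", "12")]
prompt = build_few_shot_prompt(demos, "3 + 9", "Answer the arithmetic question.")
print(prompt)
```

The key point is that "learning" here happens entirely in-context: the same frozen model weights handle any task that can be phrased this way.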

Source: https://openai.com/blog/neurips-2020/
