We demonstrate that scaling up language models greatly improves task-agnostic, few-shot performance, sometimes even becoming competitive with prior state-of-the-art fine-tuning approaches. Specifically, we...
NeurIPS is the largest machine learning conference, held every December. It brings together researchers in computational neuroscience, reinforcement learning, deep learning, and their...
Our team reviewed the papers accepted to NeurIPS 2020 and shortlisted the most interesting ones across different research areas. Here are the topics...
This year’s Annual Conference on Neural Information Processing Systems (NeurIPS 2020) is held 100% virtually from December 6th to 12th, 2020. Historically NeurIPS sells out...
Increasingly, artificial intelligence systems known as deep learning neural networks are used to inform decisions vital to human health and safety, such as...
Will transformers revolutionize computer vision like they did with natural language processing? That’s one of the major research questions investigated by computer vision scientists...
Further reading: GPE, successor features, and related approaches
Improving Generalisation for Temporal Difference Learning: The Successor Representation. Peter Dayan. Neural Computation, 1993.
Apprenticeship Learning Via Inverse...
Convolutional neural networks (CNNs) achieve state-of-the-art results in tasks such as image classification and object detection. They are used in many diverse applications,...
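At the heart of every CNN is the 2D convolution: a small kernel slides over the image and computes a weighted sum at each position, producing a feature map. Below is a minimal sketch of that operation in plain Python (no framework); the function name `conv2d` and the single-channel, stride-1, no-padding ("valid") setup are illustrative assumptions, not any particular library's API.

```python
def conv2d(image, kernel):
    """Valid 2D convolution (no padding, stride 1) of a single-channel image.

    `image` and `kernel` are lists of lists of numbers; the output feature
    map has shape (H - kh + 1, W - kw + 1).
    """
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = ih - kh + 1, iw - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            acc = 0.0
            for a in range(kh):          # slide the kernel over the window
                for b in range(kw):
                    acc += image[i + a][j + b] * kernel[a][b]
            out[i][j] = acc
    return out

# A 3x3 vertical-edge kernel applied to a 4x4 image with a vertical edge
# yields a 2x2 feature map that responds strongly everywhere on the edge.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [
    [-1, 0, 1],
    [-1, 0, 1],
    [-1, 0, 1],
]
feature_map = conv2d(image, kernel)  # [[3.0, 3.0], [3.0, 3.0]]
```

A full CNN stacks many such convolutions (with learned kernels) with nonlinearities and pooling; frameworks implement the same operation with batched, multi-channel tensors and hardware acceleration.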