The article presents a study of metalearning and multitask learning for binary classification tasks that are related through a shared representation. The central question is how much data per task is required to metalearn a good representation. The authors develop a theory of distribution-free metalearning and multitask learning that identifies the properties making metalearning possible with few samples per task, and they conclude that remarkably few samples per task suffice.
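To make the setting concrete, here is a minimal toy sketch (not the paper's algorithm) of metalearning a shared representation: each task is a halfspace acting on an unknown low-dimensional subspace, and pooling a simple moment statistic across many tasks recovers that subspace even though each task alone contributes only a handful of samples. The dimensions, the moment estimator, and the Gaussian inputs are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

d, k, T, n = 20, 2, 1000, 25  # ambient dim, shared dim, number of tasks, samples per task

# Unknown shared representation: a k-dimensional subspace (columns of B).
B, _ = np.linalg.qr(rng.standard_normal((d, k)))

# Each task is a binary halfspace classifier on top of the shared representation.
M = np.empty((T, d))
for t in range(T):
    w = rng.standard_normal(k)                 # task-specific weights in the shared space
    X = rng.standard_normal((n, d))            # Gaussian inputs
    y = np.sign(X @ (B @ w))                   # task labels
    M[t] = (y[:, None] * X).mean(axis=0)       # E[y·x] lies in span(B) for Gaussian x

# Pool the noisy per-task moment vectors; their top-k singular subspace
# estimates the shared representation.
_, _, Vt = np.linalg.svd(M, full_matrices=False)
B_hat = Vt[:k].T

# Fraction of the true subspace captured (1.0 = perfect recovery).
capture = np.linalg.norm(B_hat.T @ B) ** 2 / k
print(f"subspace capture: {capture:.2f}")
```

With only 25 samples in 20 dimensions, no single task can be learned well in isolation; the sketch illustrates why aggregating across tasks is the natural route to a good representation.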


Publication date: 21 Dec 2023
Project Page: https://arxiv.org/abs/2312.13978
Paper: https://arxiv.org/pdf/2312.13978