The study presents Simulated Overparametrization (SOP), a novel paradigm that combines the inference efficiency of compact models with the learning benefits of overparameterized models. SOP trains a larger, overparameterized model in such a way that only a smaller, efficient subset of its parameters is used for computation at inference. The study introduces a new algorithm, ‘majority kernels’, which enables this simulated training of overparameterized models and yields consistent performance gains. The approach performs strongly across a range of datasets and model architectures, outperforming baselines such as combinatorial optimization methods based on submodular optimization.
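To make the idea concrete, below is a minimal PyTorch-style sketch of a layer in this spirit: during training, a larger parameter budget (k parallel kernels) is optimized, while the forward pass and the final exported layer use a single compact kernel. The class name `MajorityKernelLinear`, the uniform averaging of kernels, and the `collapse` helper are illustrative assumptions, not the paper's exact algorithm.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MajorityKernelLinear(nn.Module):
    """Illustrative linear layer with k parallel kernels (assumed design).

    During training, the forward pass uses the mean of the k kernels, so
    gradients flow to all copies; at inference, the kernels are collapsed
    into one compact kernel so compute matches a standard linear layer.
    """

    def __init__(self, in_features: int, out_features: int, k: int = 4):
        super().__init__()
        self.k = k
        # k overparameterized copies of the weight tensor.
        self.kernels = nn.Parameter(torch.randn(k, out_features, in_features) * 0.02)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def effective_kernel(self) -> torch.Tensor:
        # Collapse the k kernels into one compact kernel by averaging.
        return self.kernels.mean(dim=0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Compute with the effective kernel only; cost matches a compact layer.
        return F.linear(x, self.effective_kernel(), self.bias)

    @torch.no_grad()
    def collapse(self) -> nn.Linear:
        # Export a plain compact layer for inference.
        layer = nn.Linear(self.kernels.shape[2], self.kernels.shape[1])
        layer.weight.copy_(self.effective_kernel())
        layer.bias.copy_(self.bias)
        return layer
```

In this sketch, training updates all k kernels through the shared forward pass, and calling `collapse()` after training yields a standard layer whose inference cost is identical to that of the compact model.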


Publication date: 8 Feb 2024
Project Page: Not provided
Paper: https://arxiv.org/pdf/2402.05033