The article discusses the limitations of Differentiable Architecture Search (DAS) in the field of Neural Architecture Search (NAS). DAS has been prominent for its time-efficient automation of neural network design, but existing methods struggle to balance small model size with satisfactory performance. To address this, the authors introduce Multi-Granularity Architecture Search (MGAS), a framework that explores a multi-granularity search space to discover networks that are both effective and efficient. The authors report that MGAS achieves a better trade-off between model performance and model size than competing methods.
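MGAS builds on differentiable search, whose core idea is to relax the discrete choice among candidate operations into a learnable weighted mixture. As a rough, hypothetical illustration only (a DARTS-style toy in NumPy, not the paper's actual MGAS implementation), the mixture on a single edge can be sketched as:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Toy candidate operations for one edge of the network (hypothetical set).
ops = {
    "identity": lambda x: x,
    "double":   lambda x: 2.0 * x,  # stand-in for a parameterized op
    "zero":     lambda x: np.zeros_like(x),
}

# Learnable architecture parameters, one per candidate op.
alpha = np.array([0.5, 1.5, -1.0])

def mixed_op(x, alpha):
    """Continuous relaxation: output is a softmax-weighted sum of all
    candidate operations, so alpha can be optimized by gradient descent."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops.values()))

x = np.array([1.0, 2.0])
y = mixed_op(x, alpha)
# After search, only the op with the largest alpha is kept ("double" here).
```

In this relaxation, architecture parameters and network weights are trained jointly, and the final discrete architecture is read off by keeping the highest-weighted operation per edge; MGAS extends this style of search across multiple granularity levels.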
Publication date: 24 Oct 2023
Project Page: Not provided
Paper: https://arxiv.org/pdf/2310.15074