The article introduces ProLAD (Progressive Learning and Adaptive Distillation), a framework for Cross-Domain Few-Shot Learning (CD-FSL) that leverages the role of the normalization layer in adapters. The framework uses two separate adapters: one without a normalization layer, effective for domains similar to the source, and one with a normalization layer, better suited to dissimilar domains. Because normalization statistics estimated from few-shot data can be noisy, the model also adopts two strategies to handle them: progressive training of the two adapters and an adaptive distillation technique. Evaluations on standard CD-FSL benchmarks show that this technique outperforms existing methods.
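To make the two-adapter idea concrete, here is a minimal NumPy sketch of a residual adapter with and without a normalization step. This is an illustrative toy, not the paper's implementation: the function names, the batch-statistic normalization, and the near-identity initialization are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def adapter_plain(x, W, b):
    # Hypothetical residual adapter WITHOUT normalization: x + linear(x).
    return x + x @ W + b

def adapter_norm(x, W, b, eps=1e-5):
    # Hypothetical residual adapter WITH a normalization layer:
    # standardize each feature using batch statistics before the
    # linear transform. With only a few support samples, mu and var
    # are noisy estimates -- the issue ProLAD's progressive training
    # and adaptive distillation are designed to mitigate.
    mu = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return x + x_hat @ W + b

d = 8
x = rng.normal(size=(16, d))        # a small "few-shot" batch of features
W = 0.01 * rng.normal(size=(d, d))  # small init: adapter starts near identity
b = np.zeros(d)

y_plain = adapter_plain(x, W, b)
y_norm = adapter_norm(x, W, b)
```

At this small initialization both adapters stay close to the identity mapping; they diverge as `W` is trained, with the normalized variant additionally re-centering features toward the target domain's statistics.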
Publication date: 19 Dec 2023
Project Page: Not provided
Paper: https://arxiv.org/pdf/2312.11260