Abstract
Few-shot Learning (FSL) is a challenging problem that aims to learn and generalize from limited examples. Recent works have adopted a combination of meta-learning and transfer learning strategies for FSL tasks. These methods perform pre-training and then transfer the learned knowledge to meta-learning. However, it remains unclear whether this transfer pattern is appropriate, and the relationship between the objectives of the two learning strategies has not been examined. In addition, meta-learning inference in FSL relies on relations among samples, which warrant further consideration. In this paper, we uncover an overlooked discrepancy in learning objectives between the pre-training and meta-learning strategies and propose a simple yet effective learning paradigm for the few-shot classification task. Specifically, the proposed method comprises two components: (i) Detach: We formulate an effective learning paradigm, Adaptive Meta-Transfer (A-MET), which adaptively eliminates undesired representations learned by pre-training to address the discrepancy. (ii) Unite: We propose a Global Similarity Compatibility Measure (GSCM) that jointly considers sample correlations at a global level for more consistent predictions. The proposed method is simple to implement and requires no complex components. Extensive experiments on four public benchmarks demonstrate that our method outperforms other state-of-the-art methods in more challenging scenarios, with large domain differences between the base and novel classes and less support information available. Code is available at: https://github.com/yaoyz96/a-met.
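The abstract does not give the exact formulation of GSCM, but the idea of scoring queries against class prototypes with a similarity measure normalized jointly over the whole episode (rather than per query) can be illustrated with a minimal sketch. Everything below (function names, the cosine-similarity choice, and the episode-wide softmax) is an assumption for illustration, not the authors' actual method:

```python
import numpy as np

def cosine_sim(a, b):
    """Row-wise cosine similarity between two sets of embeddings."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return a @ b.T

def global_similarity_scores(support, support_labels, queries, n_classes):
    """Illustrative (hypothetical) global similarity measure: build class
    prototypes from support embeddings, then normalize query-to-class
    similarities jointly over the entire query set, so each score is
    calibrated against the whole episode instead of a single query."""
    prototypes = np.stack([support[support_labels == c].mean(axis=0)
                           for c in range(n_classes)])
    sims = cosine_sim(queries, prototypes)  # shape: (n_query, n_classes)
    # Episode-level softmax over all query-class pairs at once.
    z = np.exp(sims - sims.max())
    return z / z.sum()
```

Per-query classification would still take the argmax of each row; the global normalization only changes how confidently each query-class pair is scored relative to the rest of the episode.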
| Original language | English |
|---|---|
| Article number | 110798 |
| Journal | Knowledge-Based Systems |
| Volume | 277 |
| DOIs | |
| State | Published - 9 Oct 2023 |
Keywords
- Few-shot learning
- Image classification
- Meta-learning
- Transfer learning