Self-Supervised Prototypical Transfer Learning for Few-Shot Classification

Recent advances in transfer learning and few-shot learning largely rely on annotated data related to the goal task during (pre-)training. However, collecting sufficiently similar and annotated data is often infeasible. Building on advances in self-supervised and few-shot learning, we propose to learn a metric embedding that clusters unlabeled samples and their augmentations closely together. This pre-trained embedding serves as a starting point for classification with limited labeled goal task data by summarizing class clusters and fine-tuning. Experiments show that our approach significantly outperforms state-of-the-art unsupervised meta-learning approaches, and is on par with supervised performance. In a cross-domain setting, our approach is competitive with its classical fully supervised counterpart.
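The "summarizing class clusters" step the abstract refers to follows the standard prototypical-network recipe: average the embeddings of each class's few labeled samples into a prototype, then classify queries by their nearest prototype. A minimal sketch, assuming embeddings are already computed (the toy 2-D vectors and Euclidean distance below are illustrative choices, not the paper's exact configuration):

```python
import numpy as np

def class_prototypes(embeddings, labels):
    """Average the embeddings of each class's labeled support samples."""
    classes = np.unique(labels)
    protos = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def nearest_prototype(query_embeddings, classes, prototypes):
    """Assign each query to the class of its closest prototype (Euclidean)."""
    dists = np.linalg.norm(
        query_embeddings[:, None, :] - prototypes[None, :, :], axis=-1
    )
    return classes[np.argmin(dists, axis=1)]

# Toy example: 2-D embeddings for a 2-way, 2-shot episode.
support = np.array([[0.0, 0.0], [0.2, 0.0], [1.0, 1.0], [1.2, 1.0]])
support_labels = np.array([0, 0, 1, 1])
classes, protos = class_prototypes(support, support_labels)

queries = np.array([[0.1, 0.1], [1.1, 0.9]])
print(nearest_prototype(queries, classes, protos))  # → [0 1]
```

In the paper's setting, the embedding network itself is first pre-trained self-supervised (clustering unlabeled samples with their augmentations) and can additionally be fine-tuned on the labeled goal-task data before prototypes are formed.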

Published in:
[Online proceedings - AutoML 2020]
Presented at:
7th ICML Workshop on Automated Machine Learning (AutoML 2020), Vienna, Austria, Jul 12, 2020 – Jul 18, 2020

Record created 2020-07-27, last modified 2020-07-28
