Article: Revisiting metric learning for few-shot image classification

Title: Revisiting metric learning for few-shot image classification
Authors: Li, Xiaomeng; Yu, Lequan; Fu, Chi Wing; Fang, Meng; Heng, Pheng Ann
Keywords: Metric learning; Feature representation; Few-shot learning; Deep learning
Issue Date: 2020
Citation: Neurocomputing, 2020, v. 406, p. 49-58
Abstract: The goal of few-shot learning is to recognize new visual concepts with only a few labeled samples in each class. Recent effective metric-based few-shot approaches employ neural networks to learn a feature similarity comparison between query and support examples. However, the importance of feature embedding, i.e., exploring the relationship among training samples, is neglected. In this work, we present a simple yet powerful baseline for few-shot classification that emphasizes the importance of feature embedding. Specifically, we revisit the classical triplet network from deep metric learning and extend it into a deep K-tuplet network for few-shot learning, exploiting the relationship among the input samples to learn a general feature representation via episodic training. Once trained, our network can extract discriminative features for unseen novel categories and can be seamlessly combined with a non-linear distance metric function to perform few-shot classification. Our result on the miniImageNet benchmark outperforms other metric-based few-shot classification methods. More importantly, when evaluated on completely different datasets (Caltech-101, CUB-200, Stanford Dogs and Cars) using the model trained on miniImageNet, our method significantly outperforms prior methods, demonstrating its superior capability to generalize to unseen classes.
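The abstract describes a K-tuplet extension of the triplet loss trained with episodes. As a rough, hypothetical sketch only (the paper's exact loss, margin, and distance function are not reproduced here), the snippet below shows one plausible K-tuplet hinge loss in which each query embedding is pulled toward one positive and pushed away from K negatives; the function name k_tuplet_loss and all hyperparameters are assumptions, not the authors' implementation.

```python
# Hypothetical K-tuplet hinge loss (illustrative only; not the paper's exact formulation).
import torch
import torch.nn.functional as F

def k_tuplet_loss(anchor: torch.Tensor,
                  positive: torch.Tensor,
                  negatives: torch.Tensor,
                  margin: float = 1.0) -> torch.Tensor:
    """anchor, positive: (B, D) embeddings; negatives: (B, K, D) embeddings."""
    d_pos = F.pairwise_distance(anchor, positive)                    # (B,)
    d_neg = torch.cdist(anchor.unsqueeze(1), negatives).squeeze(1)   # (B, K)
    # Push each of the K negatives at least `margin` farther from the anchor than the positive.
    return F.relu(d_pos.unsqueeze(1) - d_neg + margin).mean()

# Toy episode: 4 queries, 64-dim embeddings, K = 5 negatives per query.
a, p = torch.randn(4, 64), torch.randn(4, 64)
n = torch.randn(4, 5, 64)
print(k_tuplet_loss(a, p, n))
```

In an episodic setup, these embeddings would come from a shared backbone applied to the support and query images sampled for each episode, so that the loss shapes a representation reusable on unseen classes.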
Persistent Identifier: http://hdl.handle.net/10722/299624
ISSN: 0925-2312
2023 Impact Factor: 5.5
2023 SCImago Journal Rankings: 1.815
ISI Accession Number ID: WOS:000540920100006

 

DC Field                Value
dc.contributor.author   Li, Xiaomeng
dc.contributor.author   Yu, Lequan
dc.contributor.author   Fu, Chi Wing
dc.contributor.author   Fang, Meng
dc.contributor.author   Heng, Pheng Ann
dc.date.accessioned     2021-05-21T03:34:48Z
dc.date.available       2021-05-21T03:34:48Z
dc.date.issued          2020
dc.identifier.citation  Neurocomputing, 2020, v. 406, p. 49-58
dc.identifier.issn      0925-2312
dc.identifier.uri       http://hdl.handle.net/10722/299624
dc.language             eng
dc.relation.ispartof    Neurocomputing
dc.subject              Metric learning
dc.subject              Feature representation
dc.subject              Few-shot learning
dc.subject              Deep learning
dc.title                Revisiting metric learning for few-shot image classification
dc.type                 Article
dc.description.nature   link_to_subscribed_fulltext
dc.identifier.doi       10.1016/j.neucom.2020.04.040
dc.identifier.scopus    eid_2-s2.0-85084209390
dc.identifier.volume    406
dc.identifier.spage     49
dc.identifier.epage     58
dc.identifier.eissn     1872-8286
dc.identifier.isi       WOS:000540920100006
