Conference Paper: Not all models are equal: Predicting model transferability in a self-challenging Fisher space

Title: Not all models are equal: Predicting model transferability in a self-challenging Fisher space
Authors: Shao, W; Zhao, X; Ge, Y; Shan, Y; Luo, P
Issue Date: 2022
Publisher: Ortra Ltd.
Citation: European Conference on Computer Vision (ECCV) (Hybrid), Tel Aviv, Israel, October 23-27, 2022
Abstract: This paper addresses an important problem of ranking pre-trained deep neural networks and screening the most transferable ones for downstream tasks. It is challenging because the ground-truth model ranking for each task can only be generated by fine-tuning the pre-trained models on the target dataset, which is brute-force and computationally expensive. Recent advanced methods proposed several lightweight transferability metrics to predict the fine-tuning results. However, these approaches only capture static representations but neglect the fine-tuning dynamics. To this end, this paper proposes a new transferability metric, called Self-challenging Fisher Discriminant Analysis (SFDA), which has many appealing benefits that existing works do not have. First, SFDA can embed the static features into a Fisher space and refine them for better separability between classes. Second, SFDA uses a self-challenging mechanism to encourage different pre-trained models to differentiate on hard examples. Third, SFDA can easily select multiple pre-trained models for the model ensemble. Extensive experiments on 33 pre-trained models across 11 downstream tasks show that SFDA is efficient, effective, and robust when measuring the transferability of pre-trained models. For instance, compared with the state-of-the-art method NLEEP, SFDA demonstrates an average gain of 59.1% while bringing a 22.5x speedup in wall-clock time.
Persistent Identifier: http://hdl.handle.net/10722/315546
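The abstract describes SFDA only at a high level. As an illustration of the Fisher-space idea it builds on, here is a minimal, hypothetical sketch of a plain Fisher Discriminant Analysis separability score for features extracted by a pre-trained model. The function name, the regularizer, and the nearest-class-mean scoring are my own assumptions; the paper's actual SFDA additionally uses a self-challenging mechanism and supports model-ensemble selection, neither of which is shown here.

```python
import numpy as np

def fisher_separability(features, labels, reg=1e-3):
    """Score how separable the classes are after projecting features
    into a Fisher discriminant space. Higher is better. Generic FDA
    sketch only, not the paper's full SFDA metric."""
    X = np.asarray(features, dtype=np.float64)
    y = np.asarray(labels)
    classes = np.unique(y)
    n, d = X.shape
    mu = X.mean(axis=0)
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mu)[:, None]
        Sb += len(Xc) * (diff @ diff.T)
    Sw += reg * np.eye(d)  # keep Sw invertible
    # Directions maximizing between-class / within-class variance ratio.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw, Sb))
    order = np.argsort(evals.real)[::-1][: len(classes) - 1]
    W = evecs[:, order].real  # Fisher projection matrix
    Z = X @ W
    # Score: nearest-class-mean accuracy in the Fisher space.
    means = {c: Z[y == c].mean(axis=0) for c in classes}
    pred = np.array(
        [min(means, key=lambda c: np.linalg.norm(z - means[c])) for z in Z]
    )
    return float((pred == y).mean())
```

Under this sketch, one would extract target-dataset features with each candidate pre-trained model, compute this score per model, and rank models by it; fine-tuning is never run during scoring, which is what makes such metrics lightweight.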

 

DC Field | Value | Language
dc.contributor.author | Shao, W | -
dc.contributor.author | Zhao, X | -
dc.contributor.author | Ge, Y | -
dc.contributor.author | Shan, Y | -
dc.contributor.author | Luo, P | -
dc.date.accessioned | 2022-08-19T08:59:54Z | -
dc.date.available | 2022-08-19T08:59:54Z | -
dc.date.issued | 2022 | -
dc.identifier.citation | European Conference on Computer Vision (ECCV) (Hybrid), Tel Aviv, Israel, October 23-27, 2022 | -
dc.identifier.uri | http://hdl.handle.net/10722/315546 | -
dc.description.abstract | This paper addresses an important problem of ranking pre-trained deep neural networks and screening the most transferable ones for downstream tasks. It is challenging because the ground-truth model ranking for each task can only be generated by fine-tuning the pre-trained models on the target dataset, which is brute-force and computationally expensive. Recent advanced methods proposed several lightweight transferability metrics to predict the fine-tuning results. However, these approaches only capture static representations but neglect the fine-tuning dynamics. To this end, this paper proposes a new transferability metric, called Self-challenging Fisher Discriminant Analysis (SFDA), which has many appealing benefits that existing works do not have. First, SFDA can embed the static features into a Fisher space and refine them for better separability between classes. Second, SFDA uses a self-challenging mechanism to encourage different pre-trained models to differentiate on hard examples. Third, SFDA can easily select multiple pre-trained models for the model ensemble. Extensive experiments on 33 pre-trained models across 11 downstream tasks show that SFDA is efficient, effective, and robust when measuring the transferability of pre-trained models. For instance, compared with the state-of-the-art method NLEEP, SFDA demonstrates an average gain of 59.1% while bringing a 22.5x speedup in wall-clock time. | -
dc.language | eng | -
dc.publisher | Ortra Ltd. | -
dc.title | Not all models are equal: Predicting model transferability in a self-challenging Fisher space | -
dc.type | Conference_Paper | -
dc.identifier.email | Luo, P: pluo@hku.hk | -
dc.identifier.authority | Luo, P=rp02575 | -
dc.identifier.hkuros | 335574 | -
dc.publisher.place | Israel | -
