Article: Hierarchical Pruning of Deep Ensembles with Focal Diversity

Title: Hierarchical Pruning of Deep Ensembles with Focal Diversity
Authors: Wu, Yanzhao; Chow, Ka Ho; Wei, Wenqi; Liu, Ling
Keywords: deep learning; ensemble diversity; ensemble learning; ensemble pruning
Issue Date: 2024
Citation: ACM Transactions on Intelligent Systems and Technology, 2024, v. 15, n. 1, article no. 15
Abstract: Deep neural network ensembles combine the wisdom of multiple deep neural networks to improve generalizability and robustness over individual networks. Studying and applying deep ensemble techniques has gained increasing popularity in the deep learning community. Some mission-critical applications utilize a large number of deep neural networks to form deep ensembles to achieve the desired accuracy and resilience, which introduces high time and space costs for ensemble execution. However, it remains a critical challenge to determine whether a small subset of the entire deep ensemble can achieve the same or better generalizability, and how to effectively identify such small deep ensembles to improve the space and time efficiency of ensemble execution. This article presents a novel deep ensemble pruning approach, which can efficiently identify smaller deep ensembles that provide higher ensemble accuracy than the entire deep ensemble with a large number of member networks. Our hierarchical ensemble pruning approach (HQ) leverages three novel ensemble pruning techniques. First, we show that focal ensemble diversity metrics can accurately capture the complementary capacity of the member networks of an ensemble team, which can guide ensemble pruning. Second, we design a focal-ensemble-diversity-based hierarchical pruning approach, which iteratively finds high-quality deep ensembles with low cost and high accuracy. Third, we develop a focal diversity consensus method that integrates multiple focal diversity metrics to refine ensemble pruning results, so that smaller deep ensembles can be effectively identified to offer high accuracy, high robustness, and high ensemble execution efficiency. Evaluated on popular benchmark datasets, the proposed hierarchical ensemble pruning approach effectively identifies high-quality deep ensembles with better classification generalizability while being more time and space efficient in ensemble decision making. We have released the source code on GitHub at https://github.com/git-disl/HQ-Ensemble.
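The abstract describes three building blocks: focal ensemble diversity metrics that capture the complementary capacity of member networks, a hierarchical pruning procedure that searches for small high-quality teams, and a consensus step that combines multiple diversity metrics. The sketch below is a minimal, hypothetical Python illustration of the overall idea only; it is not the authors' implementation (the released code is at https://github.com/git-disl/HQ-Ensemble). It scores candidate sub-ensembles with a simple stand-in "focal disagreement" measure, assumed here to be the average pairwise disagreement on the validation samples that a designated focal member misclassifies, and returns the smallest shortlisted team whose plurality-vote accuracy matches or beats the full ensemble. All function names, the disagreement formula, and the brute-force search are illustrative assumptions.

```python
# Minimal, hypothetical sketch of focal-diversity-guided ensemble pruning.
# Not the HQ implementation; see https://github.com/git-disl/HQ-Ensemble for
# the released code with the actual focal diversity metrics, hierarchical
# pruning, and focal diversity consensus.
import itertools
import numpy as np

def focal_disagreement(preds, labels, focal_idx):
    """Average pairwise disagreement among the non-focal members, restricted
    to the 'focal' negative samples (samples the focal member misclassifies)."""
    focal_neg = preds[focal_idx] != labels
    if not focal_neg.any():
        return 0.0
    others = [i for i in range(len(preds)) if i != focal_idx]
    pair_scores = [np.mean(preds[i][focal_neg] != preds[j][focal_neg])
                   for i, j in itertools.combinations(others, 2)]
    return float(np.mean(pair_scores)) if pair_scores else 0.0

def ensemble_accuracy(preds, labels):
    """Plurality-vote accuracy of a team, given per-member integer predictions."""
    votes = np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)
    return float(np.mean(votes == labels))

def prune(all_preds, labels, team_sizes=(2, 3, 4), shortlist=5):
    """Brute-force stand-in for hierarchical pruning: rank candidate teams of
    each size by mean focal disagreement, then return the smallest shortlisted
    team whose plurality-vote accuracy matches or beats the full ensemble."""
    n = len(all_preds)
    full_acc = ensemble_accuracy(all_preds, labels)
    for size in team_sizes:
        scored = []
        for team in itertools.combinations(range(n), size):
            team_preds = all_preds[list(team)]
            div = np.mean([focal_disagreement(team_preds, labels, f)
                           for f in range(size)])
            scored.append((div, team))
        # Inspect only the most diverse candidates of this size.
        for _, team in sorted(scored, reverse=True)[:shortlist]:
            acc = ensemble_accuracy(all_preds[list(team)], labels)
            if acc >= full_acc:
                return team, acc
    return tuple(range(n)), full_acc
```

As a usage sketch, with `all_preds` of shape `(n_members, n_samples)` holding integer class predictions on a held-out validation set and `labels` the true classes, `prune(all_preds, labels)` returns a tuple of member indices and the team's plurality-vote validation accuracy.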
Persistent Identifier: http://hdl.handle.net/10722/343451
ISSN: 2157-6904
2023 Impact Factor: 7.2
2023 SCImago Journal Rankings: 1.882

 

DC Field                 Value
dc.contributor.author    Wu, Yanzhao
dc.contributor.author    Chow, Ka Ho
dc.contributor.author    Wei, Wenqi
dc.contributor.author    Liu, Ling
dc.date.accessioned      2024-05-10T09:08:14Z
dc.date.available        2024-05-10T09:08:14Z
dc.date.issued           2024
dc.identifier.citation   ACM Transactions on Intelligent Systems and Technology, 2024, v. 15, n. 1, article no. 15
dc.identifier.issn       2157-6904
dc.identifier.uri        http://hdl.handle.net/10722/343451
dc.language              eng
dc.relation.ispartof     ACM Transactions on Intelligent Systems and Technology
dc.subject               deep learning
dc.subject               ensemble diversity
dc.subject               ensemble learning
dc.subject               Ensemble pruning
dc.title                 Hierarchical Pruning of Deep Ensembles with Focal Diversity
dc.type                  Article
dc.description.nature    link_to_subscribed_fulltext
dc.identifier.doi        10.1145/3633286
dc.identifier.scopus     eid_2-s2.0-85183328455
dc.identifier.volume     15
dc.identifier.issue      1
dc.identifier.spage      article no. 15
dc.identifier.epage      article no. 15
dc.identifier.eissn      2157-6912
