Conference Paper: Understanding Imbalanced Semantic Segmentation Through Neural Collapse

Title: Understanding Imbalanced Semantic Segmentation Through Neural Collapse
Authors: Zhong, Zhisheng; Cui, Jiequan; Yang, Yibo; Wu, Xiaoyang; Qi, Xiaojuan; Zhang, Xiangyu; Jia, Jiaya
Issue Date: 22-Aug-2023
Abstract

A recent study has shown a phenomenon called neural collapse, in which the within-class means of features and the classifier weight vectors converge to the vertices of a simplex equiangular tight frame at the terminal phase of training for classification. In this paper, we explore the corresponding structures of the last-layer feature centers and classifiers in semantic segmentation. Based on our empirical and theoretical analysis, we point out that semantic segmentation naturally brings contextual correlation and imbalanced distribution among classes, which breaks the equiangular and maximally separated structure of neural collapse for both feature centers and classifiers. However, such a symmetric structure is beneficial to discrimination for the minor classes. To preserve these advantages, we introduce a regularizer on feature centers to encourage the network to learn features closer to the appealing structure in imbalanced semantic segmentation. Experimental results show that our method can bring significant improvements on both 2D and 3D semantic segmentation benchmarks. Moreover, our method ranks 1st and sets a new record (+6.8% mIoU) on the ScanNet200 test leaderboard.
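The simplex equiangular tight frame (ETF) the abstract refers to can be constructed explicitly: K unit vectors whose pairwise cosine is exactly -1/(K-1), the maximally separated configuration that neural collapse drives class means and classifier weights toward. The sketch below is an illustrative construction of this standard structure, not code from the paper:

```python
import numpy as np

def simplex_etf(K: int) -> np.ndarray:
    """Return K unit vectors in R^K forming a simplex ETF (rows of M).

    M = sqrt(K / (K - 1)) * (I_K - (1/K) * 1 1^T); the rows span a
    (K-1)-dimensional subspace and are pairwise equiangular.
    """
    M = np.sqrt(K / (K - 1)) * (np.eye(K) - np.ones((K, K)) / K)
    return M

K = 5
M = simplex_etf(K)
G = M @ M.T  # Gram matrix: 1 on the diagonal, -1/(K-1) off it

print(np.allclose(np.diag(G), 1.0))                   # unit norms
print(np.allclose(G[~np.eye(K, dtype=bool)], -1.0 / (K - 1)))
```

Contextual correlation and class imbalance in segmentation pull the learned centers away from this Gram structure; the paper's regularizer penalizes that deviation for feature centers.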


Persistent Identifier: http://hdl.handle.net/10722/333840

 

DC Field: Value
dc.contributor.author: Zhong, Zhisheng
dc.contributor.author: Cui, Jiequan
dc.contributor.author: Yang, Yibo
dc.contributor.author: Wu, Xiaoyang
dc.contributor.author: Qi, Xiaojuan
dc.contributor.author: Zhang, Xiangyu
dc.contributor.author: Jia, Jiaya
dc.date.accessioned: 2023-10-06T08:39:31Z
dc.date.available: 2023-10-06T08:39:31Z
dc.date.issued: 2023-08-22
dc.identifier.uri: http://hdl.handle.net/10722/333840
dc.language: eng
dc.relation.ispartof: 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) (17/06/2023-24/06/2023, Vancouver, BC, Canada)
dc.title: Understanding Imbalanced Semantic Segmentation Through Neural Collapse
dc.type: Conference_Paper
dc.identifier.doi: 10.1109/CVPR52729.2023.01873
