Conference Paper: DCAN: Deep Contour-Aware Networks for Accurate Gland Segmentation

Title: DCAN: Deep Contour-Aware Networks for Accurate Gland Segmentation
Authors: Chen, Hao; Qi, Xiaojuan; Yu, Lequan; Heng, Pheng Ann
Issue Date: 2016
Citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2016, v. 2016-December, p. 2487-2496
Abstract: © 2016 IEEE. The morphology of glands has been used routinely by pathologists to assess the malignancy degree of adenocarcinomas. Accurate segmentation of glands from histology images is a crucial step to obtain reliable morphological statistics for quantitative diagnosis. In this paper, we proposed an efficient deep contour-aware network (DCAN) to solve this challenging problem under a unified multi-task learning framework. In the proposed network, multi-level contextual features from the hierarchical architecture are explored with auxiliary supervision for accurate gland segmentation. When incorporated with multi-task regularization during the training, the discriminative capability of intermediate features can be further improved. Moreover, our network can not only output accurate probability maps of glands, but also depict clear contours simultaneously for separating clustered objects, which further boosts the gland segmentation performance. This unified framework can be efficient when applied to large-scale histopathological data without resorting to additional steps to generate contours based on low-level cues for post-separating. Our method won the 2015 MICCAI Gland Segmentation Challenge out of 13 competitive teams, surpassing all the other methods by a significant margin.
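The abstract's key idea, combining a gland probability map with a contour map to split touching glands, can be sketched as a simple post-processing step: keep a pixel as gland only where the object probability is high and the contour probability is low. The toy NumPy arrays and thresholds below are illustrative stand-ins for the two DCAN branch outputs, not the paper's actual values.

```python
import numpy as np

# Toy outputs standing in for the two DCAN branches (values are made up):
prob_gland = np.array([[0.9, 0.9, 0.8, 0.9],
                       [0.9, 0.2, 0.3, 0.9]])    # gland probability map
prob_contour = np.array([[0.1, 0.8, 0.7, 0.1],
                         [0.1, 0.1, 0.2, 0.1]])  # contour probability map

# Illustrative thresholds: a pixel is gland only if the object score is
# high AND the contour score is low, so clustered glands are separated.
t_obj, t_cnt = 0.5, 0.5
mask = (prob_gland > t_obj) & (prob_contour < t_cnt)

print(mask.astype(int))
```

Here the two high-contour pixels in the first row break an otherwise connected gland region into two components, which is the separation effect the abstract describes.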
Persistent Identifier: http://hdl.handle.net/10722/281957
ISSN: 1063-6919
2020 SCImago Journal Rankings: 4.658
ISI Accession Number ID: WOS:000400012302059

 

DC Field: Value
dc.contributor.author: Chen, Hao
dc.contributor.author: Qi, Xiaojuan
dc.contributor.author: Yu, Lequan
dc.contributor.author: Heng, Pheng Ann
dc.date.accessioned: 2020-04-09T09:19:13Z
dc.date.available: 2020-04-09T09:19:13Z
dc.date.issued: 2016
dc.identifier.citation: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2016, v. 2016-December, p. 2487-2496
dc.identifier.issn: 1063-6919
dc.identifier.uri: http://hdl.handle.net/10722/281957
dc.description.abstract: © 2016 IEEE. The morphology of glands has been used routinely by pathologists to assess the malignancy degree of adenocarcinomas. Accurate segmentation of glands from histology images is a crucial step to obtain reliable morphological statistics for quantitative diagnosis. In this paper, we proposed an efficient deep contour-aware network (DCAN) to solve this challenging problem under a unified multi-task learning framework. In the proposed network, multi-level contextual features from the hierarchical architecture are explored with auxiliary supervision for accurate gland segmentation. When incorporated with multi-task regularization during the training, the discriminative capability of intermediate features can be further improved. Moreover, our network can not only output accurate probability maps of glands, but also depict clear contours simultaneously for separating clustered objects, which further boosts the gland segmentation performance. This unified framework can be efficient when applied to large-scale histopathological data without resorting to additional steps to generate contours based on low-level cues for post-separating. Our method won the 2015 MICCAI Gland Segmentation Challenge out of 13 competitive teams, surpassing all the other methods by a significant margin.
dc.language: eng
dc.relation.ispartof: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
dc.title: DCAN: Deep Contour-Aware Networks for Accurate Gland Segmentation
dc.type: Conference_Paper
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/CVPR.2016.273
dc.identifier.scopus: eid_2-s2.0-84986267644
dc.identifier.volume: 2016-December
dc.identifier.spage: 2487
dc.identifier.epage: 2496
dc.identifier.isi: WOS:000400012302059
dc.identifier.issnl: 1063-6919
