File Download
There are no files associated with this item.
Links for fulltext (May Require Subscription)
- Publisher Website: 10.1109/TMI.2020.3038828
- Scopus: eid_2-s2.0-85116525619
- PMID: 33201808
- WOS: WOS:000702638800020
Article: Dual-Teacher++: Exploiting Intra-Domain and Inter-Domain Knowledge With Reliable Transfer for Cardiac Segmentation
Title | Dual-Teacher++: Exploiting Intra-Domain and Inter-Domain Knowledge With Reliable Transfer for Cardiac Segmentation |
---|---|
Authors | Li, K; Wang, SJ; Yu, LQ; Heng, PA |
Keywords | cardiac segmentation; cross-modality; Semi-supervised domain adaptation |
Issue Date | 1-Oct-2021 |
Publisher | Institute of Electrical and Electronics Engineers |
Citation | IEEE Transactions on Medical Imaging, 2021, v. 40, n. 10, p. 2771-2782 |
Abstract | Annotation scarcity is a long-standing problem in the medical image analysis area. To efficiently leverage limited annotations, abundant unlabeled data are additionally exploited in semi-supervised learning, while well-established cross-modality data are investigated in domain adaptation. In this paper, we aim to explore the feasibility of concurrently leveraging both unlabeled data and cross-modality data for annotation-efficient cardiac segmentation. To this end, we propose a cutting-edge semi-supervised domain adaptation framework, namely Dual-Teacher++. Besides directly learning from limited labeled target domain data (e.g., CT) via a student model, as adopted in previous literature, we design novel dual teacher models, including an inter-domain teacher model to explore cross-modality priors from the source domain (e.g., MR) and an intra-domain teacher model to investigate the knowledge beneath the unlabeled target domain. In this way, the dual teacher models transfer the acquired inter- and intra-domain knowledge to the student model for further integration and exploitation. Moreover, to encourage reliable dual-domain knowledge transfer, we enhance the inter-domain knowledge transfer on samples with higher similarity to the target domain after appearance alignment, and also strengthen intra-domain knowledge transfer of unlabeled target data with higher prediction confidence. In this way, the student model can obtain reliable dual-domain knowledge and yield improved performance on target domain data. We extensively evaluated the feasibility of our method on the MM-WHS 2017 challenge dataset. The experiments have demonstrated the superiority of our framework over other semi-supervised learning and domain adaptation methods. Moreover, our performance gains could be yielded in both directions, i.e., adapting from MR to CT and from CT to MR. Our code will be available at https://github.com/kli-lalala/Dual-Teacher-. (See the illustrative sketch after this table.) |
Persistent Identifier | http://hdl.handle.net/10722/337270 |
ISSN | 0278-0062 (2023 Impact Factor: 8.9; 2023 SCImago Journal Rankings: 3.703) |
ISI Accession Number ID | WOS:000702638800020 |
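
The abstract above describes a training scheme with a student model, an inter-domain teacher drawing on the source modality (e.g., MR), and an intra-domain teacher drawing on unlabeled target data (e.g., CT), with both transfer paths weighted for reliability. The following is a minimal PyTorch sketch of one such training step; it is not the authors' implementation (available at the GitHub link above), and the mean-teacher-style EMA update, the toy network, the fixed similarity weight, and the 0.8 confidence threshold are illustrative assumptions only.

```python
# Minimal sketch of a dual-teacher training step (illustration only; see the
# authors' repository for the actual Dual-Teacher++ implementation).
# Assumptions: mean-teacher-style EMA intra-domain teacher, a toy 2D network,
# and simple confidence thresholding standing in for the reliability weighting.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def make_net(n_classes=4):
    # Toy segmentation network standing in for the real backbone.
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, n_classes, 1),
    )

student = make_net()
intra_teacher = copy.deepcopy(student)   # intra-domain teacher (EMA of student)
inter_teacher = make_net()               # inter-domain teacher (trained on source/MR)
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

def ema_update(teacher, student, alpha=0.99):
    # Exponential moving average of student weights -> intra-domain teacher.
    for t_p, s_p in zip(teacher.parameters(), student.parameters()):
        t_p.data.mul_(alpha).add_(s_p.data, alpha=1 - alpha)

# Dummy batches standing in for labeled/unlabeled CT (target) and
# appearance-aligned MR (source) data.
labeled_ct = torch.randn(2, 1, 64, 64)
ct_labels = torch.randint(0, 4, (2, 64, 64))
unlabeled_ct = torch.randn(2, 1, 64, 64)
aligned_mr = torch.randn(2, 1, 64, 64)

# 1) Supervised loss on the limited labeled target (CT) data.
sup_loss = F.cross_entropy(student(labeled_ct), ct_labels)

# 2) Inter-domain transfer: distill the source-trained teacher's predictions on
#    appearance-aligned samples; the paper weights samples closer to the target
#    domain more strongly (here a fixed placeholder weight).
with torch.no_grad():
    inter_target = inter_teacher(aligned_mr).argmax(dim=1)
inter_weight = 0.5  # placeholder for the similarity-based reliability weight
inter_loss = inter_weight * F.cross_entropy(student(aligned_mr), inter_target)

# 3) Intra-domain transfer: use the EMA teacher's confident predictions on
#    unlabeled target data as pseudo-labels (confidence-gated for reliability).
with torch.no_grad():
    probs = F.softmax(intra_teacher(unlabeled_ct), dim=1)
    conf, pseudo = probs.max(dim=1)
mask = (conf > 0.8).float()              # keep only high-confidence pixels
intra_loss = (F.cross_entropy(student(unlabeled_ct), pseudo,
                              reduction="none") * mask).mean()

loss = sup_loss + inter_loss + intra_loss
optimizer.zero_grad()
loss.backward()
optimizer.step()
ema_update(intra_teacher, student)
print(f"sup={sup_loss.item():.3f} inter={inter_loss.item():.3f} intra={intra_loss.item():.3f}")
```

In the paper, the inter-domain weight is derived from each sample's similarity to the target domain after appearance alignment, and the intra-domain transfer is gated by teacher prediction confidence; the fixed weight and threshold above are placeholders for those mechanisms.
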
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Li, K | - |
dc.contributor.author | Wang, SJ | - |
dc.contributor.author | Yu, LQ | - |
dc.contributor.author | Heng, PA | - |
dc.date.accessioned | 2024-03-11T10:19:23Z | - |
dc.date.available | 2024-03-11T10:19:23Z | - |
dc.date.issued | 2021-10-01 | - |
dc.identifier.citation | IEEE Transactions on Medical Imaging, 2021, v. 40, n. 10, p. 2771-2782 | - |
dc.identifier.issn | 0278-0062 | - |
dc.identifier.uri | http://hdl.handle.net/10722/337270 | - |
dc.description.abstract | Annotation scarcity is a long-standing problem in the medical image analysis area. To efficiently leverage limited annotations, abundant unlabeled data are additionally exploited in semi-supervised learning, while well-established cross-modality data are investigated in domain adaptation. In this paper, we aim to explore the feasibility of concurrently leveraging both unlabeled data and cross-modality data for annotation-efficient cardiac segmentation. To this end, we propose a cutting-edge semi-supervised domain adaptation framework, namely Dual-Teacher++. Besides directly learning from limited labeled target domain data (e.g., CT) via a student model, as adopted in previous literature, we design novel dual teacher models, including an inter-domain teacher model to explore cross-modality priors from the source domain (e.g., MR) and an intra-domain teacher model to investigate the knowledge beneath the unlabeled target domain. In this way, the dual teacher models transfer the acquired inter- and intra-domain knowledge to the student model for further integration and exploitation. Moreover, to encourage reliable dual-domain knowledge transfer, we enhance the inter-domain knowledge transfer on samples with higher similarity to the target domain after appearance alignment, and also strengthen intra-domain knowledge transfer of unlabeled target data with higher prediction confidence. In this way, the student model can obtain reliable dual-domain knowledge and yield improved performance on target domain data. We extensively evaluated the feasibility of our method on the MM-WHS 2017 challenge dataset. The experiments have demonstrated the superiority of our framework over other semi-supervised learning and domain adaptation methods. Moreover, our performance gains could be yielded in both directions, i.e., adapting from MR to CT and from CT to MR. Our code will be available at https://github.com/kli-lalala/Dual-Teacher-. | - |
dc.language | eng | - |
dc.publisher | Institute of Electrical and Electronics Engineers | - |
dc.relation.ispartof | IEEE Transactions on Medical Imaging | - |
dc.rights | This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. | - |
dc.subject | cardiac segmentation | - |
dc.subject | cross-modality | - |
dc.subject | Semi-supervised domain adaptation | - |
dc.title | Dual-Teacher++: Exploiting Intra-Domain and Inter-Domain Knowledge With Reliable Transfer for Cardiac Segmentation | - |
dc.type | Article | - |
dc.identifier.doi | 10.1109/TMI.2020.3038828 | - |
dc.identifier.pmid | 33201808 | - |
dc.identifier.scopus | eid_2-s2.0-85116525619 | - |
dc.identifier.volume | 40 | - |
dc.identifier.issue | 10 | - |
dc.identifier.spage | 2771 | - |
dc.identifier.epage | 2782 | - |
dc.identifier.eissn | 1558-254X | - |
dc.identifier.isi | WOS:000702638800020 | - |
dc.publisher.place | PISCATAWAY | - |
dc.identifier.issnl | 0278-0062 | - |