Article: Respiratory Motion Correction in Abdominal MRI using a Densely Connected U-Net with GAN-guided Training

Title: Respiratory Motion Correction in Abdominal MRI using a Densely Connected U-Net with GAN-guided Training
Authors: JIANG, W; LIU, Z; CHEN, S; Ng, YL; Dou, Q; Chang, HCC; Kwok, KW
Keywords: Motion correction; Motion artifacts; Abdominal MRI
Issue Date: 2019
Citation: Submitted to the 22nd International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2019), Shenzhen, China, 13-17 October 2019. In arXiv preprint, 2019.
Abstract: Abdominal magnetic resonance imaging (MRI) provides a straightforward way of characterizing tissue and locating lesions in patients during standard diagnosis. However, abdominal MRI often suffers from respiratory motion artifacts, which lead to blurring and ghosting that significantly deteriorate imaging quality. Conventional methods to reduce or eliminate these motion artifacts include breath holding, patient sedation, respiratory gating, and image post-processing, but these strategies inevitably involve extra scanning time and patient discomfort. In this paper, we propose a novel deep-learning-based model to recover MR images from respiratory motion artifacts. The proposed model comprises a densely connected U-net with generative adversarial network (GAN)-guided training and a perceptual loss function. We validate the model using a diverse collection of MRI data that are adversely affected by both synthetic and authentic respiration artifacts. Effective outcomes of motion removal are demonstrated. Our experimental results show the great potential of utilizing deep-learning-based methods in respiratory motion correction for abdominal MRI. [https://arxiv.org/abs/1906.09745] (An illustrative code sketch of such an architecture follows below.)
Persistent Identifier: http://hdl.handle.net/10722/272885
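
The abstract names the model's building blocks (a densely connected U-Net, GAN-guided training, and a perceptual loss) without giving implementation detail, so the following is only a minimal sketch of how such a pipeline could be assembled in PyTorch. The `DenseUNet`, `PatchDiscriminator`, and `perceptual_loss` names, the layer sizes, the VGG-16 feature depth, and the loss weights are all illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a densely connected U-Net generator with a patch
# discriminator and a VGG-based perceptual loss, in the spirit of the model
# described in the abstract. All sizes and weights below are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16


class DenseBlock(nn.Module):
    """Stack of conv layers whose inputs are the concatenation of all
    previous feature maps (DenseNet-style connectivity)."""
    def __init__(self, in_ch, growth=16, n_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, 3, padding=1),
                nn.BatchNorm2d(growth),
                nn.ReLU(inplace=True)))
            ch += growth
        self.out_ch = ch

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)


class DenseUNet(nn.Module):
    """Two-level U-Net whose encoder/decoder stages are dense blocks."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.enc1 = DenseBlock(in_ch)
        self.down = nn.MaxPool2d(2)
        self.enc2 = DenseBlock(self.enc1.out_ch)
        self.up = nn.ConvTranspose2d(self.enc2.out_ch, self.enc2.out_ch, 2, stride=2)
        self.dec1 = DenseBlock(self.enc2.out_ch + self.enc1.out_ch)
        self.head = nn.Conv2d(self.dec1.out_ch, in_ch, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.down(e1))
        d1 = self.dec1(torch.cat([self.up(e2), e1], dim=1))
        return self.head(d1) + x  # residual output (an assumption, not stated in the abstract)


class PatchDiscriminator(nn.Module):
    """Small conv discriminator that scores overlapping image patches."""
    def __init__(self, in_ch=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, padding=1))

    def forward(self, x):
        return self.net(x)


def perceptual_loss(vgg_feats, pred, target):
    """L1 distance between VGG-16 feature maps of prediction and target."""
    pred3 = pred.repeat(1, 3, 1, 1)      # VGG expects 3-channel input
    target3 = target.repeat(1, 3, 1, 1)
    return F.l1_loss(vgg_feats(pred3), vgg_feats(target3))


if __name__ == "__main__":
    G, D = DenseUNet(), PatchDiscriminator()
    # Frozen early VGG-16 layers as the perceptual feature extractor
    # (downloads ImageNet weights on first run).
    vgg_feats = vgg16(weights="IMAGENET1K_V1").features[:9].eval()
    for p in vgg_feats.parameters():
        p.requires_grad_(False)
    adv = nn.BCEWithLogitsLoss()

    corrupted = torch.rand(2, 1, 64, 64)  # motion-corrupted slices (dummy data)
    clean = torch.rand(2, 1, 64, 64)      # motion-free references (dummy data)

    # One generator step: pixel L1 + perceptual + adversarial terms.
    restored = G(corrupted)
    d_fake = D(restored)
    loss_G = (F.l1_loss(restored, clean)
              + 0.1 * perceptual_loss(vgg_feats, restored, clean)
              + 0.01 * adv(d_fake, torch.ones_like(d_fake)))
    loss_G.backward()
    print("generator loss:", loss_G.item())
```

A full training loop would alternate this generator step with a discriminator step on real versus restored images; that alternation, like the residual output in `DenseUNet.forward`, is an assumption for illustration rather than a detail taken from the paper.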

 

DC Field: Value
dc.contributor.author: JIANG, W
dc.contributor.author: LIU, Z
dc.contributor.author: CHEN, S
dc.contributor.author: Ng, YL
dc.contributor.author: Dou, Q
dc.contributor.author: Chang, HCC
dc.contributor.author: Kwok, KW
dc.date.accessioned: 2019-08-06T09:18:25Z
dc.date.available: 2019-08-06T09:18:25Z
dc.date.issued: 2019
dc.identifier.citation: Submitted to the 22nd International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI 2019), Shenzhen, China, 13-17 October 2019. In arXiv preprint, 2019
dc.identifier.uri: http://hdl.handle.net/10722/272885
dc.description.abstract: Abdominal magnetic resonance imaging (MRI) provides a straightforward way of characterizing tissue and locating lesions in patients during standard diagnosis. However, abdominal MRI often suffers from respiratory motion artifacts, which lead to blurring and ghosting that significantly deteriorate imaging quality. Conventional methods to reduce or eliminate these motion artifacts include breath holding, patient sedation, respiratory gating, and image post-processing, but these strategies inevitably involve extra scanning time and patient discomfort. In this paper, we propose a novel deep-learning-based model to recover MR images from respiratory motion artifacts. The proposed model comprises a densely connected U-net with generative adversarial network (GAN)-guided training and a perceptual loss function. We validate the model using a diverse collection of MRI data that are adversely affected by both synthetic and authentic respiration artifacts. Effective outcomes of motion removal are demonstrated. Our experimental results show the great potential of utilizing deep-learning-based methods in respiratory motion correction for abdominal MRI. [https://arxiv.org/abs/1906.09745]
dc.language: eng
dc.relation.ispartof: arXiv preprint
dc.subject: Motion correction
dc.subject: Motion artifacts
dc.subject: Abdominal MRI
dc.title: Respiratory Motion Correction in Abdominal MRI using a Densely Connected U-Net with GAN-guided Training
dc.type: Article
dc.identifier.email: Ng, YL: owenylng@HKUCC-COM.hku.hk
dc.identifier.email: Chang, HCC: hcchang@hku.hk
dc.identifier.email: Kwok, KW: kwokkw@hku.hk
dc.identifier.authority: Chang, HCC=rp02024
dc.identifier.authority: Kwok, KW=rp01924
dc.description.nature: preprint
dc.identifier.hkuros: 300140
dc.identifier.hkuros: 310161
dc.identifier.hkuros: 310278
