Conference Paper: Synergistic analysis of micro-positron emission tomography/MRI data using deep learning for automatic detection of HCC

Title: Synergistic analysis of micro-positron emission tomography/MRI data using deep learning for automatic detection of HCC
Other Titles: Synergistic Analysis of Micro PET/MRI Data Using Deep Learning for Automatic Detection of Hepatocellular Carcinoma
Authors: Chang, HCC; Huang, TY; Hui, SK; Chiu, WHK
Issue Date: 2019
Publisher: SpringerOpen. The Journal's web site is located at http://www.springer.com/medicine/radiology/journal/13244
Citation: 30th European Society of Gastrointestinal and Abdominal Radiology Annual Meeting (ESGAR 2019), Rome, Italy, 5-8 June 2019. In Insights into Imaging, 2019, v. 10 n. Suppl. 2, p. 25, article no. SS 7.5
Abstract: Purpose: The purpose of this study is to investigate the benefit of synergistic analysis of positron emission tomography (PET) and MRI data using deep learning for automatic detection and segmentation of HCC. Material and methods: The micro-PET/MRI data were retrospectively collected from an animal study with an orthotopic HCC tumor model conducted in our institution. A total of thirty-eight sets of coronal F18-FDG and corresponding T2WI images were selected for this preliminary test. The labeled images were generated by drawing ROIs of the tumors on the T2WI images. Afterward, 28 and 10 sets of images were used for network training and validation, respectively. The SegNet network architecture was selected for this implementation with a multi-channel data input and was pre-trained with BRATS brain data (600,000 steps). For the micro-PET/MRI data, training ran for 10,000 steps with 6 images per step. Three types of input (both F18-FDG and T2WI, F18-FDG only, and T2WI only) were tested for the efficacy of tumor detection using SegNet. Results: The combination of PET and MRI information (F18-FDG and T2WI) provided the best mean Dice coefficient (0.69), compared with using either PET data alone (0.47) or MRI data alone (0.58) for tumor detection and segmentation. Conclusion: This preliminary study shows that synergistic analysis of PET and MRI data using deep learning is feasible for automatic detection and segmentation of HCC and provides better performance than using either PET or MRI data individually. Additionally, transfer learning can ensure proper training of the network, even with a limited amount of training data.
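
The abstract describes two concrete steps a reader may want to reproduce: stacking co-registered F18-FDG PET and T2WI slices into a multi-channel input for a SegNet-style network, and scoring predicted tumor masks against hand-drawn ROIs with the Dice coefficient. The sketch below is not the authors' implementation; it is a minimal NumPy illustration under the assumption that each modality is available as a co-registered 2D array, and the function names and normalization choices are hypothetical.

import numpy as np

def stack_pet_mri(fdg_pet, t2wi):
    # Min-max normalize each modality, then stack the co-registered slices
    # into an (H, W, 2) array, i.e. a two-channel input for segmentation.
    def norm(x):
        x = x.astype(np.float32)
        return (x - x.min()) / (x.max() - x.min() + 1e-7)
    return np.stack([norm(fdg_pet), norm(t2wi)], axis=-1)

def dice_coefficient(pred_mask, true_mask, eps=1e-7):
    # Dice = 2*|A ∩ B| / (|A| + |B|) for binary masks A and B.
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    return (2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)

# Example with random data standing in for real PET/MRI slices and ROIs.
pet = np.random.rand(128, 128)
t2 = np.random.rand(128, 128)
x = stack_pet_mri(pet, t2)  # shape (128, 128, 2)
gt = np.zeros((128, 128), dtype=bool); gt[40:80, 40:80] = True
pred = np.zeros((128, 128), dtype=bool); pred[50:90, 50:90] = True
print(x.shape, dice_coefficient(pred, gt))
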
Description: V. 10, suppl. 2 has special title: ESGAR 2019 Book of Abstracts
Persistent Identifier: http://hdl.handle.net/10722/275238
ISSN: 1869-4101
2021 Impact Factor: 5.036
2020 SCImago Journal Rankings: 1.405

 

DC Field: Value
dc.contributor.author: Chang, HCC
dc.contributor.author: Huang, TY
dc.contributor.author: Hui, SK
dc.contributor.author: Chiu, WHK
dc.date.accessioned: 2019-09-10T02:38:27Z
dc.date.available: 2019-09-10T02:38:27Z
dc.date.issued: 2019
dc.identifier.citation: 30th European Society of Gastrointestinal and Abdominal Radiology Annual Meeting (ESGAR 2019), Rome, Italy, 5-8 June 2019. In Insights into Imaging, 2019, v. 10 n. Suppl. 2, p. 25, article no. SS 7.5
dc.identifier.issn: 1869-4101
dc.identifier.uri: http://hdl.handle.net/10722/275238
dc.description: V. 10, suppl. 2 has special title: ESGAR 2019 Book of Abstracts
dc.description.abstract: Purpose: The purpose of this study is to investigate the benefit of synergistic analysis of positron emission tomography (PET) and MRI data using deep learning for automatic detection and segmentation of HCC. Material and methods: The micro-PET/MRI data were retrospectively collected from an animal study with an orthotopic HCC tumor model conducted in our institution. A total of thirty-eight sets of coronal F18-FDG and corresponding T2WI images were selected for this preliminary test. The labeled images were generated by drawing ROIs of the tumors on the T2WI images. Afterward, 28 and 10 sets of images were used for network training and validation, respectively. The SegNet network architecture was selected for this implementation with a multi-channel data input and was pre-trained with BRATS brain data (600,000 steps). For the micro-PET/MRI data, training ran for 10,000 steps with 6 images per step. Three types of input (both F18-FDG and T2WI, F18-FDG only, and T2WI only) were tested for the efficacy of tumor detection using SegNet. Results: The combination of PET and MRI information (F18-FDG and T2WI) provided the best mean Dice coefficient (0.69), compared with using either PET data alone (0.47) or MRI data alone (0.58) for tumor detection and segmentation. Conclusion: This preliminary study shows that synergistic analysis of PET and MRI data using deep learning is feasible for automatic detection and segmentation of HCC and provides better performance than using either PET or MRI data individually. Additionally, transfer learning can ensure proper training of the network, even with a limited amount of training data.
dc.language: eng
dc.publisher: SpringerOpen. The Journal's web site is located at http://www.springer.com/medicine/radiology/journal/13244
dc.relation.ispartof: Insights into Imaging
dc.relation.ispartof: The Annual Meeting of European Society of Gastrointestinal and Abdominal Radiology (ESGAR)
dc.title: Synergistic analysis of micro-positron emission tomography/MRI data using deep learning for automatic detection of HCC
dc.title.alternative: Synergistic Analysis of Micro PET/MRI Data Using Deep Learning for Automatic Detection of Hepatocellular Carcinoma
dc.type: Conference_Paper
dc.identifier.email: Chang, HCC: hcchang@hku.hk
dc.identifier.email: Hui, SK: edshui@hku.hk
dc.identifier.email: Chiu, WHK: kwhchiu@hku.hk
dc.identifier.authority: Chang, HCC=rp02024
dc.identifier.authority: Hui, SK=rp01832
dc.identifier.authority: Chiu, WHK=rp02074
dc.identifier.hkuros: 303940
dc.identifier.volume: 10
dc.identifier.issue: Suppl. 2
dc.identifier.spage: 25, article no. SS 7.5
dc.identifier.epage: 25, article no. SS 7.5
dc.publisher.place: Germany
dc.identifier.issnl: 1869-4101
