Conference Paper: A Two-Stage Approach for Automated Prostate Lesion Detection and Classification with Mask R-CNN and Weakly Supervised Deep Neural Network

Title: A Two-Stage Approach for Automated Prostate Lesion Detection and Classification with Mask R-CNN and Weakly Supervised Deep Neural Network
Authors: Liu, Z; Jiang, W; Lee, KH; Lo, YL; Ng, YL; Dou, Q; Vardhanabhuti, V; Kwok, KW
Keywords: Prostate cancer; MR images classification; Weakly supervised learning
Issue Date: 2019
Publisher: Springer
Citation: Proceedings of the First International Workshop on Artificial Intelligence in Radiation Therapy, AIRT 2019, held in conjunction with the 22nd Medical Image Computing and Computer Assisted Intervention (MICCAI) International Conference 2019, Shenzhen, China, 17 October 2019, p. 43-51
Abstract: Early diagnosis of prostate cancer is crucial to reducing the mortality rate. Multi-parametric magnetic resonance imaging (MRI) provides detailed visualization of prostate tissues and lesions, so malignancy can be assessed before invasive procedures such as needle biopsy, which carry a risk of damage to or inflammation of the periprostatic nerves, prostate and bladder neck. However, prostate tissue malignancy can be difficult to determine on magnetic resonance (MR) images, with often inconclusive results among clinicians. With the progress of artificial intelligence (AI), MR image-based lesion classification with AI tools is being explored increasingly. So far, existing classification approaches rely heavily on manual labelling of lesion areas, which is a labor-intensive and time-consuming process. In this paper, we present a novel two-stage method for fully automated prostate lesion detection and classification, using input sequences of T2-weighted images, apparent diffusion coefficient (ADC) maps and high b-value diffusion-weighted images. In the first stage, a Mask R-CNN model is trained to automatically segment prostate structures. In the second stage, a weakly supervised deep neural network is developed to detect and classify lesions in a single run. To validate the accuracy of our system, we tested our method on two datasets, one from the PROSTATEx Challenge and the other from our local cohort. Our method achieves average area-under-the-curve (AUC) values of 0.912 and 0.882 on the two datasets, respectively. The proposed approach presents a promising tool for radiologists in their clinical practice.
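The two-stage pipeline described in the abstract can be sketched at a high level. The stubs below are illustrative placeholders, not the authors' implementation: stage 1 stands in for the Mask R-CNN segmentation by cropping to a given prostate mask, and stage 2 stands in for the weakly supervised classifier with a trivial scoring rule; only the AUC computation (the rank-sum form of the ROC area) is a standard, faithful formula.

```python
import numpy as np

def crop_prostate_roi(volume, mask):
    """Stage 1 (stub): crop a 2D slice to the bounding box of a prostate
    segmentation mask, as would be produced by a Mask R-CNN model."""
    ys, xs = np.nonzero(mask)
    return volume[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

def lesion_score(roi):
    """Stage 2 (stub): a weakly supervised network would map the ROI to a
    single malignancy probability; mean intensity is a placeholder."""
    return float(roi.mean())

def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank statistic:
    the fraction of positive/negative pairs ranked correctly."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

In this decomposition, the reported per-dataset AUCs (0.912 and 0.882) would be obtained by applying `auc` to the stage-2 scores of each test cohort against its biopsy-confirmed labels.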
Persistent Identifier: http://hdl.handle.net/10722/277512
ISBN: 978-3-030-32485-8
ISSN: 0302-9743
2020 SCImago Journal Rankings: 0.249

 

DC Field / Value
dc.contributor.author: Liu, Z
dc.contributor.author: Jiang, W
dc.contributor.author: Lee, KH
dc.contributor.author: Lo, YL
dc.contributor.author: Ng, YL
dc.contributor.author: Dou, Q
dc.contributor.author: Vardhanabhuti, V
dc.contributor.author: Kwok, KW
dc.date.accessioned: 2019-09-20T08:52:29Z
dc.date.available: 2019-09-20T08:52:29Z
dc.date.issued: 2019
dc.identifier.citation: Proceedings of the First International Workshop on Artificial Intelligence in Radiation Therapy, AIRT 2019, held in conjunction with the 22nd Medical Image Computing and Computer Assisted Intervention (MICCAI) International Conference 2019, Shenzhen, China, 17 October 2019, p. 43-51
dc.identifier.isbn: 978-3-030-32485-8
dc.identifier.issn: 0302-9743
dc.identifier.uri: http://hdl.handle.net/10722/277512
dc.description.abstract: Early diagnosis of prostate cancer is crucial to reducing the mortality rate. Multi-parametric magnetic resonance imaging (MRI) provides detailed visualization of prostate tissues and lesions, so malignancy can be assessed before invasive procedures such as needle biopsy, which carry a risk of damage to or inflammation of the periprostatic nerves, prostate and bladder neck. However, prostate tissue malignancy can be difficult to determine on magnetic resonance (MR) images, with often inconclusive results among clinicians. With the progress of artificial intelligence (AI), MR image-based lesion classification with AI tools is being explored increasingly. So far, existing classification approaches rely heavily on manual labelling of lesion areas, which is a labor-intensive and time-consuming process. In this paper, we present a novel two-stage method for fully automated prostate lesion detection and classification, using input sequences of T2-weighted images, apparent diffusion coefficient (ADC) maps and high b-value diffusion-weighted images. In the first stage, a Mask R-CNN model is trained to automatically segment prostate structures. In the second stage, a weakly supervised deep neural network is developed to detect and classify lesions in a single run. To validate the accuracy of our system, we tested our method on two datasets, one from the PROSTATEx Challenge and the other from our local cohort. Our method achieves average area-under-the-curve (AUC) values of 0.912 and 0.882 on the two datasets, respectively. The proposed approach presents a promising tool for radiologists in their clinical practice.
dc.language: eng
dc.publisher: Springer
dc.relation.ispartof: Artificial Intelligence in Radiation Therapy: First International Workshop, AIRT 2019, held in conjunction with MICCAI 2019, Shenzhen, China, October 17, 2019, Proceedings
dc.relation.ispartof: Lecture Notes in Computer Science, vol 11850
dc.subject: Prostate cancer
dc.subject: MR images classification
dc.subject: Weakly supervised learning
dc.title: A Two-Stage Approach for Automated Prostate Lesion Detection and Classification with Mask R-CNN and Weakly Supervised Deep Neural Network
dc.type: Conference_Paper
dc.identifier.email: Vardhanabhuti, V: varv@hku.hk
dc.identifier.email: Kwok, KW: kwokkw@hku.hk
dc.identifier.authority: Vardhanabhuti, V=rp01900
dc.identifier.authority: Kwok, KW=rp01924
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1007/978-3-030-32486-5_6
dc.identifier.scopus: eid_2-s2.0-85075666093
dc.identifier.hkuros: 305514
dc.identifier.spage: 43
dc.identifier.epage: 51
dc.identifier.eissn: 1611-3349
dc.publisher.place: Cham
dc.identifier.issnl: 0302-9743
