Links for fulltext (may require subscription):
- Publisher Website: 10.1109/ISBI53787.2023.10230804
- Scopus: eid_2-s2.0-85172127093
- WOS: WOS:001062050500481
Conference Paper: TRANSFORMER-BASED MULTIMODAL FUSION FOR SURVIVAL PREDICTION BY INTEGRATING WHOLE SLIDE IMAGES, CLINICAL, AND GENOMIC DATA
Title | TRANSFORMER-BASED MULTIMODAL FUSION FOR SURVIVAL PREDICTION BY INTEGRATING WHOLE SLIDE IMAGES, CLINICAL, AND GENOMIC DATA |
---|---|
Authors | Chen, YH; Zhao, WQ; Yu, LQ |
Keywords | Graph Neural Network; Multi-modality; Survival Prediction; Transformer; Whole Slide Image |
Issue Date | 18-Apr-2023 |
Publisher | IEEE |
Abstract | Survival prediction using whole slide images (WSIs) is a complex and difficult task, as handling a gigapixel WSI directly is computationally infeasible. In recent years, multiple instance learning (MIL) strategies have been developed to deal with WSIs, i.e., splitting a WSI into many patches (instances) and aggregating features across patches. Moreover, to better predict patient survival outcomes, different modalities have been explored, among which gene features are used most frequently. In this paper, we explore a graph-based strategy for handling WSIs and investigate a transformer-based strategy for combining different modalities for survival prediction. In addition, clinical data are incorporated and different ways of encoding clinical information are explored. Experiments on two public datasets from The Cancer Genome Atlas (TCGA) demonstrate the effectiveness of the proposed graph-transformer framework for survival prediction. |
Persistent Identifier | http://hdl.handle.net/10722/340956 |
ISSN | 1945-7928 (2020 SCImago Journal Rankings: 0.601) |
ISI Accession Number ID | WOS:001062050500481 |
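
The abstract above describes fusing WSI, clinical, and genomic features with a transformer for survival prediction. The record contains no code, so the following is a minimal, purely illustrative sketch of that idea: per-modality embeddings are projected into a shared token space, passed through a transformer encoder, and read out as a single risk score. All layer names, dimensions, and the use of a learnable [CLS]-style token are assumptions, not details taken from the paper.

```python
# Illustrative sketch only (not the authors' released code): transformer-based
# fusion of three modality embeddings -- WSI (graph-pooled), clinical, and
# genomic -- followed by a survival risk head.
import torch
import torch.nn as nn

class MultimodalFusion(nn.Module):
    def __init__(self, wsi_dim=256, clin_dim=32, gene_dim=128,
                 d_model=256, n_heads=4, n_layers=2):
        super().__init__()
        # Project each modality into a shared token space.
        self.wsi_proj = nn.Linear(wsi_dim, d_model)
        self.clin_proj = nn.Linear(clin_dim, d_model)
        self.gene_proj = nn.Linear(gene_dim, d_model)
        # Learnable [CLS]-style token used to read out the fused representation.
        self.cls_token = nn.Parameter(torch.zeros(1, 1, d_model))
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Single risk score for Cox-style survival modelling.
        self.risk_head = nn.Linear(d_model, 1)

    def forward(self, wsi_feat, clin_feat, gene_feat):
        # Each input: one (batch, feature_dim) vector per patient.
        tokens = torch.stack(
            [self.wsi_proj(wsi_feat),
             self.clin_proj(clin_feat),
             self.gene_proj(gene_feat)], dim=1)          # (batch, 3, d_model)
        cls = self.cls_token.expand(tokens.size(0), -1, -1)
        fused = self.encoder(torch.cat([cls, tokens], dim=1))
        return self.risk_head(fused[:, 0])               # (batch, 1) risk score
```

In such a setup, the patient-level WSI embedding would typically come from pooling a graph neural network over patch features; a sketch of building that patch graph follows the DC field table at the end of this record.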
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Chen, YH | - |
dc.contributor.author | Zhao, WQ | - |
dc.contributor.author | Yu, LQ | - |
dc.date.accessioned | 2024-03-11T10:48:34Z | - |
dc.date.available | 2024-03-11T10:48:34Z | - |
dc.date.issued | 2023-04-18 | - |
dc.identifier.issn | 1945-7928 | - |
dc.identifier.uri | http://hdl.handle.net/10722/340956 | - |
dc.description.abstract | Survival prediction using whole slide images (WSIs) is a complex and difficult task, as handling a gigapixel WSI directly is computationally infeasible. In recent years, multiple instance learning (MIL) strategies have been developed to deal with WSIs, i.e., splitting a WSI into many patches (instances) and aggregating features across patches. Moreover, to better predict patient survival outcomes, different modalities have been explored, among which gene features are used most frequently. In this paper, we explore a graph-based strategy for handling WSIs and investigate a transformer-based strategy for combining different modalities for survival prediction. In addition, clinical data are incorporated and different ways of encoding clinical information are explored. Experiments on two public datasets from The Cancer Genome Atlas (TCGA) demonstrate the effectiveness of the proposed graph-transformer framework for survival prediction. | -
dc.language | eng | - |
dc.publisher | IEEE | - |
dc.relation.ispartof | The IEEE International Symposium on Biomedical Imaging | - |
dc.subject | Graph Neural Network | - |
dc.subject | Multi-modality | - |
dc.subject | Survival Prediction | - |
dc.subject | Transformer | - |
dc.subject | Whole Slide Image | - |
dc.title | TRANSFORMER-BASED MULTIMODAL FUSION FOR SURVIVAL PREDICTION BY INTEGRATING WHOLE SLIDE IMAGES, CLINICAL, AND GENOMIC DATA | - |
dc.type | Conference_Paper | - |
dc.identifier.doi | 10.1109/ISBI53787.2023.10230804 | - |
dc.identifier.scopus | eid_2-s2.0-85172127093 | - |
dc.identifier.volume | 2023-April | - |
dc.identifier.isi | WOS:001062050500481 | - |
dc.publisher.place | NEW YORK | - |
dc.identifier.eisbn | 978-1-6654-7358-3 | - |
dc.identifier.issnl | 1945-7928 | - |
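
For the "graph-based strategy to handle WSIs" mentioned in the abstract, a common approach is to connect each WSI patch to its nearest neighbours in slide coordinates and feed the resulting graph to a GNN. The sketch below is an assumption-level illustration of that step, not the authors' method: the value of k, the distance metric, and the use of `torch_geometric` are all choices made here for demonstration.

```python
# Illustrative sketch only: turning a bag of WSI patch features into a graph
# by linking each patch to its k nearest neighbours in slide coordinates.
import numpy as np
import torch
from torch_geometric.data import Data

def build_patch_graph(patch_feats: np.ndarray, patch_xy: np.ndarray, k: int = 8) -> Data:
    """patch_feats: (N, D) patch feature vectors; patch_xy: (N, 2) patch centres."""
    # Pairwise Euclidean distances between patch centres (O(N^2), fine for a sketch).
    diff = patch_xy[:, None, :] - patch_xy[None, :, :]
    dist = np.sqrt((diff ** 2).sum(-1))
    np.fill_diagonal(dist, np.inf)            # exclude self-loops
    # For each patch, keep its k nearest neighbours as directed edges.
    nbrs = np.argsort(dist, axis=1)[:, :k]    # (N, k)
    src = np.repeat(np.arange(len(patch_xy)), k)
    dst = nbrs.reshape(-1)
    edge_index = torch.tensor(np.stack([src, dst]), dtype=torch.long)
    return Data(x=torch.tensor(patch_feats, dtype=torch.float32), edge_index=edge_index)
```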