File Download
There are no files associated with this item.
Links for fulltext
(May Require Subscription)
- Publisher Website: 10.1109/ijcnn52387.2021.9534150
- Scopus: eid_2-s2.0-85116409182
- WOS: WOS:000722581706102
Conference Paper: TW-TGNN: Two Windows Graph-Based Model for Text Classification
Title | TW-TGNN: Two Windows Graph-Based Model for Text Classification |
---|---|
Authors | Wu, X; Luo, Z; Du, Z; Wang, J; Gao, C; Li, X |
Keywords | Text classification; Graph neural network; Representation learning |
Issue Date | 2021 |
Publisher | IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000500 |
Citation | 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China, 18-22 July 2021, p. 1-8 |
Abstract | Text classification is a fundamental and classical task in natural language processing (NLP). Recently, graph neural network (GNN) methods, especially graph-based models, have been applied to this task because of their superior capacity for capturing global co-occurrence information. However, some existing GNN-based methods adopt a corpus-level graph structure, which causes high memory consumption. In addition, these methods do not account for global co-occurrence information and local semantic information at the same time. To address these problems, we propose a new GNN-based model, namely the Two Windows Text GNN model (TW-TGNN), for text classification. More specifically, we build a text-level graph for each text with a local sliding window and a dynamic global window. On the one hand, the local window, sliding within the text, acquires sufficient local semantic features. On the other hand, the dynamic global window, sliding between texts, generates a dynamic shared weight matrix, which overcomes the limitation of fixed corpus-level co-occurrence and provides richer dynamic global information. Our experimental results on four benchmark datasets illustrate the improvement of the proposed method over state-of-the-art text classification methods. Moreover, we find that our method captures adequate global information for short texts, which helps overcome the insufficient contextual information in short-text classification. |
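The paper's exact graph construction is not reproduced in this record, but the core idea described in the abstract (a text-level co-occurrence graph built with a local sliding window) can be sketched as follows. The function name `build_text_graph` and the default window size are illustrative assumptions, not the authors' code; the dynamic global window component is omitted here.

```python
from collections import defaultdict

def build_text_graph(tokens, window_size=3):
    """Minimal sketch: build a text-level co-occurrence graph for one text.
    Nodes are the unique tokens; an undirected edge's weight counts how often
    two distinct tokens co-occur inside a sliding window of the given size.
    (Illustrative only -- not the TW-TGNN authors' implementation.)"""
    nodes = sorted(set(tokens))
    edges = defaultdict(int)
    for i in range(len(tokens)):
        # pair token i with every later token inside the same window
        for j in range(i + 1, min(i + window_size, len(tokens))):
            a, b = tokens[i], tokens[j]
            if a != b:
                edges[tuple(sorted((a, b)))] += 1  # undirected: canonical key
    return nodes, dict(edges)

nodes, edges = build_text_graph("graph neural network for text graph".split())
```

Building one such graph per text (rather than one corpus-level graph) is what keeps memory consumption bounded by document length, which is the limitation of corpus-level methods that the abstract highlights.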
Persistent Identifier | http://hdl.handle.net/10722/305718 |
ISSN | 2161-4393 |
ISI Accession Number ID | WOS:000722581706102 |
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Wu, X | - |
dc.contributor.author | Luo, Z | - |
dc.contributor.author | Du, Z | - |
dc.contributor.author | Wang, J | - |
dc.contributor.author | Gao, C | - |
dc.contributor.author | Li, X | - |
dc.date.accessioned | 2021-10-20T10:13:21Z | - |
dc.date.available | 2021-10-20T10:13:21Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China, 18-22 July 2021, p. 1-8 | - |
dc.identifier.issn | 2161-4393 | - |
dc.identifier.uri | http://hdl.handle.net/10722/305718 | - |
dc.description.abstract | Text classification is a fundamental and classical task in natural language processing (NLP). Recently, graph neural network (GNN) methods, especially graph-based models, have been applied to this task because of their superior capacity for capturing global co-occurrence information. However, some existing GNN-based methods adopt a corpus-level graph structure, which causes high memory consumption. In addition, these methods do not account for global co-occurrence information and local semantic information at the same time. To address these problems, we propose a new GNN-based model, namely the Two Windows Text GNN model (TW-TGNN), for text classification. More specifically, we build a text-level graph for each text with a local sliding window and a dynamic global window. On the one hand, the local window, sliding within the text, acquires sufficient local semantic features. On the other hand, the dynamic global window, sliding between texts, generates a dynamic shared weight matrix, which overcomes the limitation of fixed corpus-level co-occurrence and provides richer dynamic global information. Our experimental results on four benchmark datasets illustrate the improvement of the proposed method over state-of-the-art text classification methods. Moreover, we find that our method captures adequate global information for short texts, which helps overcome the insufficient contextual information in short-text classification. | - |
dc.language | eng | - |
dc.publisher | IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000500 | - |
dc.relation.ispartof | International Joint Conference on Neural Networks (IJCNN) | - |
dc.rights | International Joint Conference on Neural Networks (IJCNN). Copyright © IEEE. | - |
dc.rights | ©2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | - |
dc.subject | Text classification | - |
dc.subject | Graph neural network | - |
dc.subject | Representation learning | - |
dc.title | TW-TGNN: Two Windows Graph-Based Model for Text Classification | - |
dc.type | Conference_Paper | - |
dc.identifier.email | Du, Z: zwdu@hku.hk | - |
dc.identifier.authority | Du, Z=rp02777 | - |
dc.description.nature | link_to_subscribed_fulltext | - |
dc.identifier.doi | 10.1109/ijcnn52387.2021.9534150 | - |
dc.identifier.scopus | eid_2-s2.0-85116409182 | - |
dc.identifier.hkuros | 327517 | - |
dc.identifier.spage | 1 | - |
dc.identifier.epage | 8 | - |
dc.identifier.isi | WOS:000722581706102 | - |
dc.publisher.place | United States | - |