Conference Paper: TW-TGNN: Two Windows Graph-Based Model for Text Classification

Title: TW-TGNN: Two Windows Graph-Based Model for Text Classification
Authors: Wu, X; Luo, Z; Du, Z; Wang, J; Gao, C; Li, X
Keywords: Text classification; Graph neural network; Representation learning
Issue Date: 2021
Publisher: IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000500
Citation: 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China, 18-22 July 2021, p. 1-8
Abstract: Text classification is a fundamental and classical task in natural language processing (NLP). Recently, graph neural network (GNN) methods, especially graph-based models, have been applied to this task because of their superior capacity for capturing global co-occurrence information. However, some existing GNN-based methods adopt a corpus-level graph structure, which causes high memory consumption. In addition, these methods do not take global co-occurrence information and local semantic information into account at the same time. To address these problems, we propose a new GNN-based model for text classification, namely the two-window text GNN model (TW-TGNN). More specifically, we build a text-level graph for each text with a local sliding window and a dynamic global window. On the one hand, the local window sliding inside the text acquires sufficient local semantic features. On the other hand, the dynamic global window sliding between texts generates a dynamic shared weight matrix, which overcomes the limitation of fixed corpus-level co-occurrence and provides richer dynamic global information. Our experimental results on four benchmark datasets illustrate the improvement of the proposed method over state-of-the-art text classification methods. Moreover, we find that our method captures adequate global information for short texts, which helps overcome the lack of contextual information in short text classification.
Persistent Identifier: http://hdl.handle.net/10722/305718
ISSN: 2161-4393
ISI Accession Number ID: WOS:000722581706102
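The text-level graph construction described in the abstract — linking words that co-occur inside a local sliding window over a single document — can be sketched as follows. This is an illustrative sketch only: the function name `build_text_graph` and the `window_size` value are assumptions for demonstration, not details taken from the paper, and the paper's dynamic global window and shared weight matrix are not modeled here.

```python
from collections import defaultdict

def build_text_graph(tokens, window_size=3):
    """Build a text-level co-occurrence graph for one document.

    Nodes are the unique tokens; an undirected edge links every pair of
    distinct tokens that co-occur inside the sliding window, weighted by
    co-occurrence count. `window_size` is an illustrative choice, not a
    value from the paper.
    """
    edges = defaultdict(int)
    for i, u in enumerate(tokens):
        # Pair the current token with the others inside the window.
        for j in range(i + 1, min(i + window_size, len(tokens))):
            v = tokens[j]
            if u == v:
                continue  # skip self-loops
            edges[tuple(sorted((u, v)))] += 1
    nodes = sorted(set(tokens))
    return nodes, dict(edges)

nodes, edges = build_text_graph(
    "graph neural network for text classification".split())
```

Building one such graph per text (rather than one corpus-level graph shared by all documents) is what keeps memory consumption low, which is the motivation the abstract gives for the text-level design.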

 

DC Field: Value
dc.contributor.author: Wu, X
dc.contributor.author: Luo, Z
dc.contributor.author: Du, Z
dc.contributor.author: Wang, J
dc.contributor.author: Gao, C
dc.contributor.author: Li, X
dc.date.accessioned: 2021-10-20T10:13:21Z
dc.date.available: 2021-10-20T10:13:21Z
dc.date.issued: 2021
dc.identifier.citation: 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China, 18-22 July 2021, p. 1-8
dc.identifier.issn: 2161-4393
dc.identifier.uri: http://hdl.handle.net/10722/305718
dc.description.abstract: Text classification is a fundamental and classical task in natural language processing (NLP). Recently, graph neural network (GNN) methods, especially graph-based models, have been applied to this task because of their superior capacity for capturing global co-occurrence information. However, some existing GNN-based methods adopt a corpus-level graph structure, which causes high memory consumption. In addition, these methods do not take global co-occurrence information and local semantic information into account at the same time. To address these problems, we propose a new GNN-based model for text classification, namely the two-window text GNN model (TW-TGNN). More specifically, we build a text-level graph for each text with a local sliding window and a dynamic global window. On the one hand, the local window sliding inside the text acquires sufficient local semantic features. On the other hand, the dynamic global window sliding between texts generates a dynamic shared weight matrix, which overcomes the limitation of fixed corpus-level co-occurrence and provides richer dynamic global information. Our experimental results on four benchmark datasets illustrate the improvement of the proposed method over state-of-the-art text classification methods. Moreover, we find that our method captures adequate global information for short texts, which helps overcome the lack of contextual information in short text classification.
dc.language: eng
dc.publisher: IEEE. The Journal's web site is located at http://ieeexplore.ieee.org/xpl/conhome.jsp?punumber=1000500
dc.relation.ispartof: International Joint Conference on Neural Networks (IJCNN)
dc.rights: International Joint Conference on Neural Networks (IJCNN). Copyright © IEEE.
dc.rights: ©2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
dc.subject: Text classification
dc.subject: Graph neural network
dc.subject: Representation learning
dc.title: TW-TGNN: Two Windows Graph-Based Model for Text Classification
dc.type: Conference_Paper
dc.identifier.email: Du, Z: zwdu@hku.hk
dc.identifier.authority: Du, Z=rp02777
dc.description.nature: link_to_subscribed_fulltext
dc.identifier.doi: 10.1109/ijcnn52387.2021.9534150
dc.identifier.scopus: eid_2-s2.0-85116409182
dc.identifier.hkuros: 327517
dc.identifier.spage: 1
dc.identifier.epage: 8
dc.identifier.isi: WOS:000722581706102
dc.publisher.place: United States
